Question

We can define the F-measure as follows:

$F_{\alpha}=\frac{1}{\alpha \frac{1}{P}+(1-\alpha)\frac{1}{R}} $

Now we might be interested in choosing a good $\alpha$. In the article "The truth of the F-measure", the author states that one can impose the condition:

$\beta=R/P$ at the point where $\frac{\partial F_{\alpha}}{\partial P}=\frac{\partial F_{\alpha}}{\partial R}$,

and then we obtain $\alpha=1/(\beta^2+1)$ and

$F_{\beta}=\frac{(1+\beta^2)PR}{\beta^2 P+R} $
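As a quick numeric sanity check (my own sketch, not from the article; the function names and sample values are arbitrary), substituting $\alpha = 1/(\beta^2+1)$ into the weighted harmonic mean $F_{\alpha}$ should reproduce the $F_{\beta}$ formula above:

```python
def f_alpha(p, r, alpha):
    # Weighted harmonic mean of precision and recall, as defined in the post.
    return 1.0 / (alpha / p + (1.0 - alpha) / r)

def f_beta(p, r, beta):
    # The standard F_beta formula from the post.
    return (1.0 + beta**2) * p * r / (beta**2 * p + r)

p, r, beta = 0.7, 0.5, 2.0
alpha = 1.0 / (beta**2 + 1.0)          # alpha = 1/(beta^2 + 1)
print(abs(f_alpha(p, r, alpha) - f_beta(p, r, beta)) < 1e-12)  # True
```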

It is said that "the motivation behind this condition is that at the point where the gradients of E w.r.t. P and R are equal, the ratio of R against P should be a desired ratio $\beta$." I understand that the condition guarantees that the user is willing to trade an increment in precision for an equal loss in recall. But I do not get why the equality of both partial derivatives corresponds to this hypothesis. I would rather have expected the condition that one partial derivative equals the other partial derivative multiplied by minus one. Could anyone explain to me why the desired condition (the condition in words) corresponds to this equality (the condition in math terms)?

EDIT:

Well, we could do the following:

$dF=\frac{\partial F_{\alpha}}{\partial P}\,dP+\frac{\partial F_{\alpha}}{\partial R}\,dR$.

And since we want $dF=0$ whenever $dP=-dR$, the condition follows easily.
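One can also verify the condition numerically (a sketch of my own, not from the article; sample values are arbitrary): with $\alpha = 1/(\beta^2+1)$, finite-difference estimates of the two partial derivatives at a point with $R=\beta P$ agree.

```python
def f_alpha(p, r, alpha):
    # Weighted harmonic mean of precision and recall.
    return 1.0 / (alpha / p + (1.0 - alpha) / r)

def partials(p, r, alpha, h=1e-7):
    # Central finite differences for dF/dP and dF/dR.
    dfdp = (f_alpha(p + h, r, alpha) - f_alpha(p - h, r, alpha)) / (2 * h)
    dfdr = (f_alpha(p, r + h, alpha) - f_alpha(p, r - h, alpha)) / (2 * h)
    return dfdp, dfdr

beta = 2.0
alpha = 1.0 / (beta**2 + 1.0)
p = 0.4
r = beta * p                     # evaluate exactly at the point R = beta * P
dfdp, dfdr = partials(p, r, alpha)
print(abs(dfdp - dfdr) < 1e-6)   # True
```

(Analytically, $\frac{\partial F_{\alpha}}{\partial P} = \frac{\alpha}{P^2}F_{\alpha}^2$ and $\frac{\partial F_{\alpha}}{\partial R} = \frac{1-\alpha}{R^2}F_{\alpha}^2$, so equality holds exactly when $R^2/P^2 = (1-\alpha)/\alpha = \beta^2$.)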

But I have one problem with this: since $\frac{\partial F_{\alpha}}{\partial P}\big/\frac{\partial F_{\alpha}}{\partial R}=1$, and since the gradient is perpendicular to each level curve (https://ocw.mit.edu/courses/mathematics/18-02sc-multivariable-calculus-fall-2010/2.-partial-derivatives/part-b-chain-rule-gradient-and-directional-derivatives/session-36-proof/MIT18_02SC_pb_32_comb.pdf), the level curve should have slope $m=-1$. Nonetheless, when I calculate the level curve for some constant $c$ I get the result

$R(P)=\frac{c(1-\alpha ) P}{P-c\alpha}$,

which is clearly not a linear function with slope $m=-1$. What am I missing?
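(To rule out an algebra slip, I checked my level-curve expression numerically; this is my own sketch with arbitrary values of $\alpha$ and $c$. Plugging $R(P)$ back into $F_{\alpha}$ does return the constant $c$, so the formula itself seems right.)

```python
def f_alpha(p, r, alpha):
    # Weighted harmonic mean of precision and recall.
    return 1.0 / (alpha / p + (1.0 - alpha) / r)

alpha, c = 0.2, 0.5

def level_curve(p):
    # R(P) = c (1 - alpha) P / (P - c * alpha), valid for P > c * alpha.
    return c * (1.0 - alpha) * p / (p - c * alpha)

# F_alpha should equal c everywhere along the curve.
print(all(abs(f_alpha(p, level_curve(p), alpha) - c) < 1e-12
          for p in (0.2, 0.4, 0.6, 0.8)))  # True
```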

No correct solution

Licensed under: CC-BY-SA with attribution
Not affiliated with datascience.stackexchange