Question

We use the log-likelihood (denoted lambda) to reduce the risk of numerical underflow (in the context of sentiment analysis with Naive Bayes).

What does "reduce the risk of numerical underflow" mean?


Solution

Arithmetic underflow can happen when the result of a calculation has a smaller absolute value than the smallest number the computer can represent in a fixed-length (fixed-precision) binary format. Instead of returning the actual result of the calculation, the computer returns zero.

Arithmetic underflow can happen in many statistical and machine learning models when many small likelihoods are multiplied together.

Taking the log of each likelihood and summing the logs (instead of multiplying the raw likelihoods) maps tiny positive products to moderate negative numbers that stay well within the representable range, so arithmetic underflow is far less likely to happen.
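A minimal sketch of the problem and the fix (the likelihood values here are made up for illustration): multiplying 1000 likelihoods of 1e-5 each underflows to exactly 0.0, while summing their logs stays representable.

```python
import math

# 1000 small likelihoods, each 1e-5 (hypothetical values for illustration)
likelihoods = [1e-5] * 1000

# Direct product underflows: the true value, 1e-5000, is far below
# the smallest positive float (~5e-324), so the result collapses to 0.0.
product = 1.0
for p in likelihoods:
    product *= p
print(product)  # 0.0 -- underflow

# Summing logs stays in a comfortably representable range.
log_product = sum(math.log(p) for p in likelihoods)
print(log_product)  # about -11512.9, i.e. 1000 * log(1e-5)
```

Note that the sum of logs preserves ordering: whichever class has the largest log-likelihood also has the largest likelihood, so classification decisions are unchanged.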

Other tips

This problem is not limited to Naive Bayes; it is the same for all probabilistic models. It arises whenever we need to multiply many small numbers and cannot represent the result with enough precision, so we work with logs instead.

But sometimes we need to calculate a sum of products of probabilities (e.g. a marginal likelihood or a normalizing constant), and since logs turn products into sums but not sums into anything convenient, we use the log-sum-exp trick for that.

Refer to a reference on log-sum-exp for details.
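A sketch of the log-sum-exp trick (the `logsumexp` helper and the example values are illustrative, not from the original answer): shifting by the maximum log-value before exponentiating keeps at least one `exp()` argument at 0, so the sum inside the log never rounds down to zero.

```python
import math

def logsumexp(log_vals):
    """Compute log(sum(exp(x) for x in log_vals)) without underflow.

    Subtracting the max makes the largest exponent exactly 0, so the
    inner sum is at least 1 and the log is always well-defined.
    """
    m = max(log_vals)
    return m + math.log(sum(math.exp(x - m) for x in log_vals))

# Hypothetical per-class joint log-likelihoods: exp() of any of them
# underflows to 0.0, so a naive log(sum(exp(...))) would be log(0).
log_probs = [-1000.0, -1001.0, -999.0]
print(math.exp(-1000.0))      # 0.0 -- naive exponentiation underflows
print(logsumexp(log_probs))   # about -998.59, computed safely
```

In practice one would use `scipy.special.logsumexp`, which implements the same idea.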

License: CC-BY-SA with attribution
Not affiliated with datascience.stackexchange