Question

Entropy is the randomness collected by an operating system or application for use in cryptography or other applications that require random data. The formula for entropy is $$H(p_1, \ldots, p_k)=-\sum_{i=1}^{k} p_i\log_2(p_i) \text{ [bit]}$$
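
To sanity-check the arithmetic below, here is a minimal Python sketch of this formula (the `entropy` helper is my own naming, not a library function):

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p_i * log2(p_i))."""
    # Terms with p = 0 contribute nothing (the limit of p*log2(p) is 0).
    return -sum(p * math.log2(p) for p in probs if p > 0)
```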

So if I were to calculate the entropy of a coin toss, it would be $$H(\frac{1}{2}, \frac{1}{2})=-\left(\frac{1}{2}\log_2(\frac{1}{2})+\frac{1}{2}\log_2(\frac{1}{2})\right)=-\left(-\frac{1}{2}-\frac{1}{2}\right)=1 \text{ bit}$$

But why is there a $\frac{1}{2}$ before each $\log$? Also, if I were doing an experiment with $3$ outcomes, each with probability $\frac{1}{3}$, would the entropy be

$$H(\frac{1}{3}, \frac{1}{3}, \frac{1}{3})=-\left(\frac{1}{3}\log_2(\frac{1}{3})+\frac{1}{3}\log_2(\frac{1}{3})+\frac{1}{3}\log_2(\frac{1}{3})\right)$$
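
Checking both cases numerically with the sketch above:

```python
print(entropy([1/2, 1/2]))       # 1.0 bit, as in the coin-toss example
print(entropy([1/3, 1/3, 1/3]))  # log2(3) ≈ 1.585 bits
```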
