Simulating a probability of 1/2^N with fewer than N random bits
Question
Say I need to simulate the following discrete distribution:
$$ P(X = k) = \begin{cases} \frac{1}{2^N}, & \text{if $k = 1$} \\ 1 - \frac{1}{2^N}, & \text{if $k = 0$} \end{cases} $$
The most obvious way is to draw $N$ random bits and check whether all of them equal $0$ (or $1$). However, information theory says
$$ \begin{align} S & = - \sum_{i} P_i \log{P_i} \\ & = - \frac{1}{2^N} \log{\frac{1}{2^N}} - \left(1 - \frac{1}{2^N}\right) \log{\left(1 - \frac{1}{2^N}\right)} \\ & = \frac{1}{2^N} \log{2^N} + \left(1 - \frac{1}{2^N}\right) \log{\frac{2^N}{2^N - 1}} \\ & \to 0 \quad \text{as } N \to \infty \end{align} $$
So the minimum number of random bits required (taking logarithms base $2$, so that entropy is measured in bits) actually decreases as $N$ grows large. How is this possible?
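As a numerical sanity check of the limit above, a short script (illustrative; the name `entropy_bits` is my own) evaluates $S$ with base-2 logarithms for a few values of $N$:

```python
import math

def entropy_bits(n):
    """Shannon entropy, in bits, of the distribution P(X=1) = 1/2**n."""
    p = 2.0 ** -n
    # -p log2 p - (1-p) log2 (1-p), matching the derivation above
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

for n in (1, 4, 10, 20):
    print(n, entropy_bits(n))
```

For $N = 1$ this is exactly $1$ bit (a fair coin), and it decreases monotonically from there; by $N = 10$ the entropy is already around $0.01$ bits.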
Please assume that we are running on a computer where fair random bits are your only source of randomness, so you can't just toss a biased coin.
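The obvious $N$-bit method from the question can be sketched as follows (a minimal illustration; the name `draw_x` and the early-exit refinement of checking bits one at a time are my own, not part of the question):

```python
import random

def draw_x(n, rng=random):
    """Return 1 with probability 1/2**n, else 0, using only fair bits.

    The obvious method: draw up to n fair bits and report 1 only if
    all of them are 0. Checking the bits one at a time lets us stop
    at the first 1, so the worst case still costs n bits, but the
    expected number of bits consumed stays below 2 for every n.
    """
    for _ in range(n):
        if rng.getrandbits(1):
            return 0  # saw a 1, so the all-zero event has failed
    return 1  # all n bits were 0: probability 1/2**n
```

Even with the early exit, this scheme still consumes at least one whole bit per draw, which is far more than the fraction-of-a-bit entropy computed above.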
No correct solution