Question

Suppose I have the following channel from the alphabet $\mathcal{X} = \{0, 1\}$ to $\mathcal{Y} = \{0, 1\}$. The channel acts as

\begin{align}
0 &\rightarrow 0 \quad \text{with probability } 1 \\
1 &\rightarrow 0 \quad \text{with probability } 1/2 \\
1 &\rightarrow 1 \quad \text{with probability } 1/2
\end{align}
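
To make the model concrete, here is a minimal Python sketch of one channel use (purely illustrative; the function name `z_channel` and the sampling setup are my own, not part of the problem statement):

```python
import numpy as np

def z_channel(x, rng):
    """One use of the channel: 0 passes through noiselessly,
    1 is mapped to 0 or 1 with probability 1/2 each."""
    if x == 0:
        return 0
    return int(rng.random() < 0.5)

rng = np.random.default_rng(0)
samples = [z_channel(1, rng) for _ in range(100_000)]
print("empirical P(Y=1 | X=1):", np.mean(samples))   # close to 0.5
print("X=0 always gives Y=0:", z_channel(0, rng) == 0)  # True
```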

The classical capacity of this channel is given by the mutual information between the input register $X$ and the output register $Y$, maximized over input probability distributions $p(x)$. That is,

$$C = \max_{p(x)} \left[ H(Y) - H(Y|X) \right]$$
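
Spelling out both terms for this channel: $Y = 1$ occurs only when $X = 1$ is sent and the channel leaves it intact, so writing $q := p(1)$ and $H_2(\cdot)$ for the binary entropy function (notation introduced here for convenience),

$$H(Y|X) = (1 - q)\cdot 0 + q \cdot H_2\!\left(\tfrac{1}{2}\right) = q, \qquad H(Y) = H_2\!\left(\tfrac{q}{2}\right),$$

and the maximization reduces to

$$C = \max_{q \in [0,1]} \left[ H_2\!\left(\tfrac{q}{2}\right) - q \right].$$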

The capacity-achieving input distribution turns out to be $(p(0), p(1)) = (0.6, 0.4)$ and the capacity is $0.3219$ bits per channel use.
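
As a numerical sanity check on these numbers, here is a rough Python sketch (the grid search and function names are mine, not part of the original question):

```python
import numpy as np

def mutual_information(q):
    """I(X;Y) in bits for the channel above, with q = P(X=1)."""
    p_y1 = 0.5 * q                      # Y=1 only if X=1 and the 1 survives
    h_y = -sum(p * np.log2(p) for p in (p_y1, 1.0 - p_y1) if p > 0)
    h_y_given_x = q * 1.0               # H(Y|X=0)=0, H(Y|X=1)=1 bit
    return h_y - h_y_given_x

qs = np.linspace(0.0, 1.0, 100_001)                  # grid over P(X=1)
best = max(qs, key=mutual_information)
print(f"optimal P(X=1) = {best:.4f}")                # 0.4000
print(f"capacity       = {mutual_information(best):.4f} bits")  # 0.3219
```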

How does one actually use this result to communicate practically (in the asymptotic limit) over this channel at this rate? Does it mean, for example, that I should use codewords whose symbols obey these statistics over the input alphabet?

No correct solution
