Question

Consider a sequence of $n$ flips of an unbiased coin. Let $H_i$ denote the absolute value of the excess of the number of heads over tails seen in the first $i$ flips. Define $H=\max_i H_i$. Show that $E[H_i]=\Theta(\sqrt{i})$ and $E[H]=\Theta(\sqrt{n})$.

This problem appears in the first chapter of *Randomized Algorithms* by Motwani and Raghavan, so perhaps there is an elementary proof of the above statement. I'm unable to solve it, so I would appreciate any help.


Solution

Your coin flips form a one-dimensional random walk $X_0,X_1,\ldots$ starting at $X_0 = 0$, with $X_{i+1} = X_i \pm 1$, each option taken with probability $1/2$. Now $H_i = |X_i|$ and so $H_i^2 = X_i^2$. It is easy to calculate $E[X_i^2] = i$ (this is just the variance), and so $E[H_i] \leq \sqrt{E[H_i^2]} = \sqrt{i}$ by Jensen's inequality (concavity of the square root). We also know that $X_i$ is approximately normally distributed with zero mean and variance $i$, and so you can calculate $E[H_i] \approx \sqrt{(2/\pi)i}$.
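As a numerical sanity check (my addition, not part of the original answer), $E[H_n]$ can be computed exactly from the binomial distribution and compared with the normal approximation $\sqrt{2n/\pi}$:

```python
from math import comb, sqrt, pi

def exact_E_H(n):
    """Exact E[H_n] = E[|X_n|], using X_n = 2k - n with k ~ Binomial(n, 1/2)."""
    return sum(abs(2 * k - n) * comb(n, k) for k in range(n + 1)) / 2 ** n

# E[H_n] approaches sqrt(2n/pi), as the normal approximation predicts
for n in (10, 100, 1000):
    print(n, exact_E_H(n), sqrt(2 * n / pi))
```

The two columns agree to within a relative error of order $1/n$, consistent with $E[H_n] = \Theta(\sqrt{n})$.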

As for $E[\max_{i \leq n} H_i]$, we have the law of the iterated logarithm, which (perhaps) leads us to expect something slightly larger than $\sqrt{n}$. If you are satisfied with an upper bound of $\tilde{O}(\sqrt{n})$, you can use a large deviation bound for each $X_i$ and then the union bound, though that ignores the fact that the $X_i$ are correlated.

Edit: As it happens, $\Pr[\max_{i \leq n} X_i = k] = \Pr[X_n = k] + \Pr[X_n = k+1]$ due to the reflection principle, see this question. So $$ \begin{align*} E[\max_{i \leq n} X_i] &= \sum_{k \geq 0} k(\Pr[X_n = k] + \Pr[X_n = k+1]) \\ &= \sum_{k \geq 1} (2k-1) \Pr[X_n = k] \\ &= \sum_{k \geq 1} 2k \Pr[X_n = k] - \frac{1}{2} + \frac{1}{2}\Pr[X_n = 0] \\ &= E[H_n] + \Theta(1), \end{align*} $$ since $\Pr[H_n = k] = \Pr[X_n = k] + \Pr[X_n = -k] = 2\Pr[X_n = k]$. Now $$ \frac{\max_{i \leq n} X_i + \max_{i \leq n} (-X_i)}{2} \leq \max_{i \leq n} H_i \leq \max_{i \leq n} X_i + \max_{i \leq n} (-X_i), $$ and therefore $E[\max_{i \leq n} H_i] \leq 2E[H_n] + \Theta(1) = O(\sqrt{n})$. The other direction is similar.
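The reflection-principle identity $\Pr[\max_{i \leq n} X_i = k] = \Pr[X_n = k] + \Pr[X_n = k+1]$ can be checked exhaustively for small $n$ by dynamic programming over path counts (a verification sketch I added, not part of the answer):

```python
from math import comb

def max_distribution(n):
    """Path counts of max_{i<=n} X_i for a simple symmetric walk of n steps."""
    states = {(0, 0): 1}  # (current position, running max) -> number of paths
    for _ in range(n):
        nxt = {}
        for (pos, m), c in states.items():
            for step in (1, -1):
                p = pos + step
                key = (p, max(m, p))
                nxt[key] = nxt.get(key, 0) + c
        states = nxt
    counts = {}  # marginal path counts of the running max
    for (_, m), c in states.items():
        counts[m] = counts.get(m, 0) + c
    return counts

n = 12
counts = max_distribution(n)
for k in range(n + 1):
    # reflection principle: #paths with max = k equals
    # #paths ending at k plus #paths ending at k + 1 (one vanishes by parity)
    paths = comb(n, (n + k) // 2) if (n + k) % 2 == 0 else 0
    paths += comb(n, (n + k + 1) // 2) if (n + k + 1) % 2 == 0 else 0
    assert counts.get(k, 0) == paths
```

The $O(n^2)$ state space limits this to small $n$, but it confirms the identity exactly, with no sampling error.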

Other tips

You can use the half-normal distribution to prove the answer.

If $X$ is normally distributed with mean $0$ and variance $\sigma^2$, then $|X|$ follows a half-normal distribution with mean $\sigma \sqrt{\frac{2}{\pi}}$ and variance $\sigma^2 (1-2/\pi)$. This gives the required answer, since the variance $\sigma^2$ of the walk after $n$ steps is $n$, and by the central limit theorem the distribution of $X$ is approximately normal.

Here $X$ is the position of the random walk Yuval Filmus described, i.e. the sum of the $\pm 1$ steps.

In the first $2i$ flips, suppose we get $k$ tails; then $H_{2i}=2|i-k|$. Therefore, \begin{align} E(H_{2i}) & =2\sum_{k=0}^{i}\binom{2i}{k}(\frac{1}{2})^{2i}\cdot 2(i-k)\\ & = (\frac{1}{2})^{2i-2}\left[i\sum_{k=0}^{i}\binom{2i}{k}-\sum_{k=0}^ik\binom{2i}{k}\right]\\ & = (\frac{1}{2})^{2i-2}\left[i\left(2^{2i}+\binom{2i}{i}\right)/2-2i\sum_{k=0}^{i-1}\binom{2i-1}{k}\right]\\ & = (\frac{1}{2})^{2i-2}\cdot i\left[2^{2i-1}+\binom{2i}{i}/2-2^{2i-1}\right]\\ & = 2i\cdot\binom{2i}{i}/2^{2i}. \end{align} Using Stirling's approximation, we conclude $E(H_{2i})=\Theta(\sqrt{2i})$.
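The closed form $E(H_{2i}) = 2i\binom{2i}{i}/2^{2i}$ can be checked against the defining sum, and against the Stirling estimate $\sqrt{2(2i)/\pi}$, with a few lines of Python (my verification sketch, not from the original answer):

```python
from math import comb, sqrt, pi

def E_H_closed(i):
    """Closed form E[H_{2i}] = 2i * C(2i, i) / 2^(2i) from the derivation."""
    return 2 * i * comb(2 * i, i) / 4 ** i

def E_H_direct(i):
    """Direct expectation of 2|i - k| with k ~ Binomial(2i, 1/2)."""
    return sum(2 * abs(i - k) * comb(2 * i, k) for k in range(2 * i + 1)) / 4 ** i

for i in (1, 5, 50, 500):
    assert abs(E_H_closed(i) - E_H_direct(i)) < 1e-9
    # Stirling gives E[H_{2i}] ~ sqrt(2 * (2i) / pi) from below
    print(i, E_H_closed(i), sqrt(2 * (2 * i) / pi))
```

Since $\binom{2i}{i}/4^i = (1 + O(1/i))/\sqrt{\pi i}$, the ratio of the two printed columns tends to $1$ from below.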

Licensed under: CC-BY-SA with attribution
Not affiliated with cs.stackexchange