A graph $G = (V, E)$ is called an $(n, d, \varepsilon)$-expander if the graph has $n$ vertices, maximum degree $d$, and satisfies the following expansion property:

  • for every subset $W\subset V$ such that $|W|\leq n/2$, $|W\cup N(W)|\ge (1+\varepsilon)|W|$

where $N(W)$ is the neighborhood of $W$ in $G$. The expansion property means, informally, that the neighborhood of $W$ is "large" when $W$ is "small". That expanders exist can be shown with the probabilistic method (see, e.g., this writeup from Ellis).
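To make the definition concrete, here is a brute-force check of the expansion property. It is exponential in the number of vertices, so it is only meant as an illustration on toy graphs; the 8-vertex test graph and the value $\varepsilon = 0.25$ are arbitrary choices, not one of the constructions discussed below.

```python
from itertools import combinations

def neighborhood(adj, W):
    """N(W): all vertices adjacent to some vertex of W."""
    return {u for w in W for u in adj[w]}

def is_expander(adj, eps):
    """Check |W ∪ N(W)| >= (1 + eps)|W| for every W with |W| <= n/2."""
    n = len(adj)
    for k in range(1, n // 2 + 1):
        for W in combinations(range(n), k):
            Wset = set(W)
            if len(Wset | neighborhood(adj, Wset)) < (1 + eps) * len(Wset):
                return False
    return True

# Toy example: an 8-vertex cycle with "long" chords (maximum degree 3).
n = 8
adj = {v: {(v - 1) % n, (v + 1) % n, (v + n // 2) % n} for v in range(n)}
print(is_expander(adj, eps=0.25))
```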

I'm trying to show that expanders can be used to amplify the success probability of one-sided Monte Carlo algorithms. In particular, given a polynomial time algorithm $A$ for a problem in RP that uses $n$ random bits and gives false negatives with probability at most $1/2$, we want to show there is a polynomial time algorithm that uses $n$ random bits and gives false negatives with probability at most $n^{-c}$ for a fixed constant $c$.

Here is the method I've been asked to analyze. We regard $A$ as a function $A(x,r)$, where $x$ is the input and $r$ is a string representing the random bits. On input $x$, we then do the following (a code sketch is given after the list):

  • Fix a $(2^n,d,\varepsilon)$-expander $G$ and identify its vertices with $\{0,1\}^n$
  • Select a vertex $r$ of $G$ uniformly at random
  • Compute the set $y_1,y_2,\ldots, y_t$ of vertices at distance at most $\delta$ from $r$ in $G$
  • Say "no" if and only if $A(x,r),A(x,y_1), \ldots, A(x,y_t)$ are all "no"

What I need to show is that for $\delta = O(\log n)$ the probability of a false negative is at most $n^{-c}$.

As a side note, this is actually enough to accomplish the original goal. A result of Gabber and Galil is that there is a family of $(2^n, 5, (2 - \sqrt{3})/4)$-expanders with the additional property that the neighbors of any vertex can be enumerated in polynomial time. This means that $d$ and $\varepsilon$ can be taken to be constants independent of $n$, and that BFS from $r$ can be implemented in polynomial time per vertex considered. Since $t \le d^{\delta + 1}$, the new algorithm runs $A$ a number of times polynomial in $n$ when $\delta = O(\log n)$, and $n$ is itself polynomial in the input size. Since $G$ is fixed in advance and does not depend on $A$ or on the input, the procedure also clearly uses no randomness beyond the choice of $r$.
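For completeness, the counting behind the runtime claim: a ball of radius $\delta$ in a graph of maximum degree $d$ satisfies

$$t \;\le\; |B(r,\delta)| \;\le\; \sum_{i=0}^{\delta} d^{\,i} \;\le\; d^{\,\delta+1},$$

so with $d = 5$ and $\delta \le c'\log_2 n$ we get $t \le 5^{c'\log_2 n + 1} = O\big(n^{c'\log_2 5}\big)$, which is polynomial in $n$.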


Here is how I approached the problem thus far.

Let $G$ be given. Since $|V| = 2^n$, we have an identification $V \cong \{0,1\}^n$. Let $r \in \{0,1\}^n$ be chosen uniformly at random, so that $r$ corresponds to a random starting vertex of $G$. Define the ball $B(v,\delta) = \{u \in V : d(u,v) \leq \delta\}$, where $d(u,v)$ is the length of a shortest $uv$-path. We are only interested in the number of bad balls, namely balls $B(v,\delta)$ such that every $u \in B(v,\delta)$ results in a false negative, that is, our $\mathsf{RP}$ algorithm should accept but $A(x,u)$ rejects nonetheless. If we can bound the number of bad balls from above by $2^n/n^c$ for some constant $c$, then we are done, since an error occurs if and only if the chosen vertex $r$ is such that $B(r,\delta)$ is bad.
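In symbols, since $r$ is uniform over $V$,

$$\Pr[\text{false negative}] \;=\; \Pr_{r}\big[B(r,\delta)\ \text{is bad}\big] \;=\; \frac{\big|\{v \in V : B(v,\delta)\ \text{is bad}\}\big|}{2^{n}},$$

so a bound of $2^n/n^c$ on the number of bad centers gives the desired bound of $n^{-c}$ on the error probability.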

I tried bounding the number of vertices in a bad ball using the expansion property of $G$, but I am not sure how this helps, since the balls are certainly not disjoint. We also know that at most half of the vertices result in a false negative.

Any help would be greatly appreciated.
