Question

In the fields of cryptography and computational complexity there is the notion of a negligible function.

I have some difficulty understanding the intuition behind this notion. The following are some definitions from Chapter 9 (Cryptography) of the textbook *Computational Complexity: A Modern Approach* by Arora and Barak that make extensive use of negligible functions. I state my question about negligible functions after each definition.

Before proceeding further, we make a simple definition that will greatly simplify notation throughout this chapter.

Definition of negligible function. A function $\epsilon : \mathbb{N} \rightarrow [0,1]$ is called negligible if $\epsilon(n)=n^{-\omega(1)}$.

Because negligible functions tend to zero very fast as their input grows, events that happen with negligible probability can be safely ignored in most practical and theoretical settings.

So far so good; it's just the definition of a negligible function. The only point is: why do we need to care about this function at all if it "can be safely ignored"?

The notion of a computationally secure encryption scheme. $k \in_R \{0,1\}^n$, $x \in_R \{0,1\}^m$, $\Pr[A(E_k(x))=(i,b) \text{ s.t. } x_i=b] \leq \frac{1}{2}+\epsilon(n)$.

A less intuitive usage of a negligible function. As I understand it, $A$ can in general guess the uniformly distributed bit $x_i$ with probability $\frac{1}{2}$, so it would make sense for the bound on its success probability to be $\leq \frac{1}{2}$. However, the bound is $\leq \frac{1}{2} + \epsilon(n)$; if we can "safely ignore" $\epsilon(n)$, why mention it at all? And secondly: is it possible to run $A$ some fixed finite number of times to push the success probability arbitrarily close to 1?

Definition of one-way function. $x\in_R\{0,1\}^n$, $y=f(x)$, $\Pr[A(y)=x' \text{ s.t. } f(x')=y] < \epsilon(n)$.

In this case the usage of a negligible function is very intuitive: the success probability is upper-bounded by the negligible function $\epsilon(n)$. I am not sure how this relates to the existence of computationally secure encryption schemes (of course $=0$ would be preferable for an encryption scheme), but if $\epsilon(n)$ can be safely ignored then it's fine.

The problem is that I don't quite understand why we need negligible functions. By quoting a few definitions I have tried to be more specific about what exactly I don't understand.

I would appreciate it if anyone could shed some light on the usage of negligible functions.


Solution

If you repeat an algorithm $A$ that succeeds with negligible probability $\epsilon(n)$ on inputs of length $n$, then the number of repetitions until the first success is a geometrically distributed random variable $S$. Its expected value is

$$\mathbb{E}(S) = \frac{1}{\epsilon(n)} = n^{\omega(1)} \quad .$$
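This expectation can be checked empirically. The sketch below (the helper name `expected_repeats` is mine, not from any library) models each attack attempt as a Bernoulli trial with success probability $\epsilon$, repeats until the first success, and averages the repetition count. For $\epsilon(n)=2^{-n}$ the estimate tracks $1/\epsilon(n)=2^n$, i.e. it grows exponentially in $n$:

```python
import random

def expected_repeats(eps, trials=10_000):
    """Empirically estimate E(S): repeat a Bernoulli(eps) attack
    until the first success and average the repetition counts."""
    total = 0
    for _ in range(trials):
        s = 1
        while random.random() >= eps:
            s += 1
        total += s
    return total / trials

# For eps(n) = 2^-n the expected number of repetitions doubles with n:
for n in (2, 4, 6, 8):
    print(n, round(expected_repeats(2.0 ** -n)), 2 ** n)
```

Each printed estimate should be close to the exact value $2^n$ in the last column, which is why no fixed polynomial number of repetitions can rescue an attacker whose per-run success probability is negligible.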

So an attacker running $A$ polynomially many times (say $p(n)$ times) cannot achieve non-negligible success probability, which we will see by comparing expected values. Let $A'$ be the algorithm that repeats $A$ $p(n)$ times, and let $S'$ be the random variable counting how many times $A'$ has to be run until the first success. Then:

$$\mathbb{E}(S) \leq p(n) \mathbb{E}(S')\quad,$$

so a non-negligible success probability for $A'$ would imply a non-negligible success probability for $A$, since the product of two polynomials is again a polynomial. Concretely, if $A'$ succeeded with probability at least $1/q(n)$ for some polynomial $q$, then $\mathbb{E}(S') \leq q(n)$ and hence $\mathbb{E}(S) \leq p(n)q(n)$, which is polynomial and contradicts $\mathbb{E}(S) = n^{\omega(1)}$. So allowing negligible success probabilities does no harm (at least against polynomially bounded adversaries).

This essentially contains an indirect proof that $p(n)\epsilon(n)$ is still negligible for any polynomial $p$, which can also be proven directly: $p$ is dominated by its largest-degree term, so $p(n)\in \mathcal O(n^c)$ for some $c \in\mathbb{N}$. On the other hand, $\epsilon(n)\in o(n^{-d})$ for all $d\in\mathbb{N}$. In particular, for any $d'\in\mathbb{N}$ take $d=d'+c$; then $\epsilon(n)\in o(n^{-(d'+c)})$, so $p(n)\epsilon(n)\in \mathcal O(n^c)\,o(n^{-(d'+c)})=o(n^{-d'})$. Since this holds for all $d'\in\mathbb{N}$, the product is again negligible.
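A quick numeric sanity check (a sketch, not a proof) illustrates the same point: even multiplied by a fairly large polynomial $p(n)=n^{10}$, a negligible $\epsilon(n)=2^{-n}$ still vanishes once $n$ is large enough, because the exponential decay eventually dominates any fixed polynomial growth. The function name below is illustrative, not from any library:

```python
def poly_times_negligible(n, c=10):
    """p(n) * eps(n) for p(n) = n**c and eps(n) = 2**-n."""
    return n ** c * 2.0 ** -n

# The product shrinks rapidly once n passes the crossover point:
samples = [poly_times_negligible(n) for n in (10, 50, 100, 200)]
print(samples)
assert all(a > b for a, b in zip(samples, samples[1:]))  # decreasing on this range
assert poly_times_negligible(200) < 1e-30                # effectively zero
```

Note that for small $n$ the product can still grow (here it peaks around $n \approx 14$); negligibility is an asymptotic statement about all sufficiently large $n$.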

If you insist on a success probability of $0$ (or exactly $\frac{1}{2}$ in the case of distinguishers), many applications become (nearly) impossible. Consider a cryptographic hash function $h$ and an unknown argument $x$. The probability of finding a preimage $x'$ s.t. $h(x')=h(x)$, taken over all $x$, cannot be $0$, because testing all possible inputs will eventually succeed. You could instead choose to bound the probability by $2^{-n^{\Omega(1)}}$ or similar and call only those functions negligible. But most of the time only unbounded adversaries or efficient (i.e., polynomially bounded) adversaries are considered, so the different notions wouldn't matter: an efficient adversary can break neither, and an unbounded one can break both.
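The "testing all inputs eventually succeeds" argument can be made concrete with a toy sketch. Here `toy_hash` (truncated SHA-256, used purely as a stand-in for a compressing hash) and `brute_force_preimage` are hypothetical names of my own; the point is only that exhaustive search over a finite input space always finds a preimage when one exists, so the attacker's success probability cannot be exactly $0$:

```python
import hashlib
from itertools import product

def toy_hash(bits, out_bits=8):
    """Toy compressing 'hash': SHA-256 truncated to out_bits.
    For illustration only -- NOT a secure construction."""
    h = hashlib.sha256(bits.encode()).digest()
    return int.from_bytes(h, "big") >> (len(h) * 8 - out_bits)

def brute_force_preimage(target, n):
    """Exhaustively try every n-bit string. If a preimage of length n
    exists, this always finds one -- hence Pr[success] > 0."""
    for tup in product("01", repeat=n):
        x = "".join(tup)
        if toy_hash(x) == target:
            return x
    return None

secret = "1011010011"                  # an unknown 10-bit input x
y = toy_hash(secret)                   # the attacker only sees h(x)
x2 = brute_force_preimage(y, 10)       # exhaustive search over 2^10 inputs
assert x2 is not None and toy_hash(x2) == y
```

With real parameters the same search takes $2^n$ steps, so its success probability per unit of polynomial work is negligible, which is exactly the relaxation the definition permits.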

Licensed under: CC-BY-SA with attribution
Not affiliated with cs.stackexchange