I'm trying to figure out the following problem:

If algorithm $A$ has a big theta notation of $n^3$ and algorithm $B$ has a big theta notation of $n^2$, there might be an infinite number of inputs for which $A$ has a lower running time than $B$. Is the statement correct?

My guess is that the statement is not correct, because big-theta notation bounds the running time from both sides, so a $\Theta(n^3)$ algorithm can't be faster than a $\Theta(n^2)$ one on any input.
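
For reference, this is the definition of $\Theta$ I have in mind:

$$f(n) = \Theta(g(n)) \iff \exists\, c_1, c_2 > 0,\ n_0 : \forall n \ge n_0,\quad c_1\, g(n) \le f(n) \le c_2\, g(n).$$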

Is my reasoning correct?

Thanks

Solution

The answer depends on what you mean exactly by "an algorithm has a big-theta notation". Asymptotic notation is used to denote sets of functions, not algorithms. It is important to be clear about which function you are talking about.

If, given an algorithm $Alg$, you denote by $f_{Alg}(n)$ the maximum running time of $Alg$ among all inputs of size $n$, then your guess is wrong. Let $B$ be an algorithm that just executes an empty loop for $n^2$ iterations. Let $A$ be an algorithm that checks whether the first bit of the input is $0$ or $1$: if it is $0$, it executes an empty loop for $n^3$ iterations; if it is $1$, it terminates immediately. With these choices, $f_A(n) = \Theta(n^3)$ and $f_B(n) = \Theta(n^2)$, yet there are infinitely many inputs on which $A$ runs faster than $B$ (namely, essentially all input strings whose first bit is $1$).
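
Here is a minimal sketch of this construction in Python (the step counters and function names are illustrative, not part of the original answer):

```python
def run_B(x: str) -> int:
    """B: empty loop for n^2 iterations; returns the step count."""
    n = len(x)
    steps = 0
    for _ in range(n * n):
        steps += 1
    return steps


def run_A(x: str) -> int:
    """A: n^3 iterations if the first bit is '0', constant time if it is '1'."""
    n = len(x)
    if x and x[0] == "1":
        return 1  # terminates immediately
    steps = 0
    for _ in range(n ** 3):
        steps += 1
    return steps


# f_A(n) = Theta(n^3) (worst case: first bit '0'), f_B(n) = Theta(n^2),
# yet A beats B on every input whose first bit is '1' -- infinitely many.
for x in ("1" + "0" * k for k in range(1, 5)):
    print(len(x), run_A(x), run_B(x))  # run_A(x) is far below run_B(x)
```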

In this sense, saying that $f_{Alg}(n) = \Omega(g(n))$ means that there is a constant $c>0$ such that, for every sufficiently large $n$, there is at least one input of size $n$ on which $Alg$ runs for at least $c \cdot g(n)$ time. With this interpretation you cannot say that an algorithm $X$ that takes $2^n$ time on inputs of even length and $n$ time on inputs of odd length has a time complexity of $\Omega(2^n)$.
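
Spelled out, the worst-case running time of $X$ is

$$f_X(n) = \begin{cases} 2^n & \text{if } n \text{ is even},\\ n & \text{if } n \text{ is odd},\end{cases}$$

and for any fixed $c > 0$, every sufficiently large odd $n$ gives $f_X(n) = n < c \cdot 2^n$. The required inequality therefore fails infinitely often, so $f_X(n) \neq \Omega(2^n)$ under this interpretation.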

That said, $f_{Alg}(n) = \Omega(g(n))$ is sometimes used in the following weaker sense: there is a constant $c>0$ such that, for every sufficiently large $n$, there is at least one input size $n' \ge n$ with $f_{Alg}(n') \ge c \cdot g(n')$. In this sense you would say that $X$ has a time complexity of $\Omega(2^n)$.
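
Indeed, $X$ meets this weaker definition with $c = 1$: for every $n$, the even input size $n' = 2\lceil n/2 \rceil \ge n$ satisfies $f_X(n') = 2^{n'} \ge 1 \cdot 2^{n'}$.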

Less commonly, $f_{Alg}(n) = \Omega(g(n))$ is used in the following stronger sense: there is a constant $c>0$ such that, for every sufficiently large $n$ and for every input of size $n$, $Alg$ runs for at least $c \cdot g(n)$ time on that input. In this sense, $f_A$ as defined above would not even belong to $\Omega(n^3)$. This notion is not really useful, since it is often very easy to design algorithms for which the best lower bound on the running time under this interpretation is $\Omega(n)$ (e.g., by first checking whether the input belongs to some class of inputs for which the solution is trivially known).
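
As an illustration (a hypothetical example, not from the original answer): a comparison sort that first checks whether its input is already sorted spends only linear time on sorted inputs, so under this strong interpretation the best lower bound on its running time is $\Omega(n)$ rather than $\Omega(n \log n)$.

```python
def sort_with_trivial_check(a: list) -> list:
    """Sort a list, returning immediately if it is already sorted.

    The linear pre-check makes the running time Theta(n) on sorted
    inputs, so under the 'strong' Omega interpretation the best lower
    bound on this algorithm's running time is Omega(n).
    """
    if all(a[i] <= a[i + 1] for i in range(len(a) - 1)):
        return a  # trivial case detected: already sorted, linear time
    return sorted(a)  # general case
```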
