Question

Let's say we are trying to solve some algorithmic problem $A$ whose input has size $n$. We say an algorithm $B$ that runs in time $T(n)$ is asymptotically better than an algorithm $C$ that runs in time $G(n)$ if $T(n) = O(G(n))$ but $G(n)$ is not $O(T(n))$.

My question relates to the asymptotic running time of graph algorithms, which usually depends on $|V|$ and $|E|$. Specifically, I want to focus on Prim's algorithm. If we implement the priority queue with a binary heap, the running time is $O(E\log V)$; with a Fibonacci heap we could get a running time of $O(V\log V + E)$.
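For concreteness, here is a minimal sketch of the binary-heap variant I have in mind (Python; the adjacency-list format `adj[u] = [(v, w), ...]` is just an assumption for illustration). It uses `heapq` with lazy deletion, so the heap may hold up to $O(E)$ entries, which gives the $O(E\log V)$ bound since $\log E = O(\log V)$:

```python
import heapq

def prim_mst_weight(adj, start=0):
    """Total weight of an MST of a connected graph in O(E log V).

    adj[u] = [(v, w), ...] is an adjacency list (assumed format).
    """
    visited = set()
    total = 0
    heap = [(0, start)]  # (edge weight, vertex)
    while heap and len(visited) < len(adj):
        w, u = heapq.heappop(heap)
        if u in visited:
            continue  # stale entry: u was already reached via a lighter edge
        visited.add(u)
        total += w
        for v, wv in adj[u]:
            if v not in visited:
                heapq.heappush(heap, (wv, v))
    return total
```

For example, `prim_mst_weight({0: [(1, 2), (2, 5)], 1: [(0, 2), (2, 1)], 2: [(0, 5), (1, 1)]})` returns `3` (edges 0–1 and 1–2).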

My question is: do we say that $O(V\log V + E)$ is asymptotically better than $O(E\log V)$?

Let me clarify: I know that if the graph is dense the answer is yes, but if $E=O(V)$ the two bounds coincide. What I am really asking is how an asymptotic improvement is usually defined when there is more than one variable, and, worse still, when the variables are not independent ($V-1\le E<V^2$, since we assume the graph is connected for Prim's algorithm).

Thanks!


Solution

The most permissive definition is as follows.

Suppose that $f(V,E)$ and $g(V,E)$ are the running times of two graph algorithms solving the same problem.

We say that $f$ is an asymptotic improvement on $g$ in some regime if there exists a sequence $(V_n,E_n)$ with $V_n \to \infty$ such that $$ \lim_{n\to\infty} \frac{f(V_n,E_n)}{g(V_n,E_n)} = 0. $$
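To see the definition in action, here is a quick numerical sketch, with `f` and `g` standing in for the two Prim bounds from the question (the formulas are illustrative stand-ins, not measured running times):

```python
import math

def f(V, E):  # Fibonacci-heap bound: V log V + E
    return V * math.log(V) + E

def g(V, E):  # binary-heap bound: E log V
    return E * math.log(V)

# Dense regime, E_n = V_n^2: the ratio is 1/V + 1/log(V) -> 0,
# so f is an asymptotic improvement on g along this sequence.
for V in (10**2, 10**4, 10**6):
    print(V, f(V, V * V) / g(V, V * V))

# Sparse regime, E_n = V_n: the ratio is 1 + 1/log(V) -> 1,
# so neither bound improves on the other there.
for V in (10**2, 10**4, 10**6):
    print(V, f(V, V) / g(V, V))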

Sometimes the regime is not deemed interesting, but that's a more subjective matter.

Note also that the problem already manifests itself for one-variable functions. Consider $$ f(n) = n^2, \qquad g(n) = \begin{cases} 2^n &\text{if } n = 2^m, \\ n & \text{otherwise}. \end{cases} $$ Is $f$ an asymptotic improvement over $g$? It is for inputs whose length is a power of two, and so it is indeed one under the permissive definition above, even though $g$ is much faster on every other input.
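The same kind of sketch makes this pathological pair concrete:

```python
def f(n):
    return n * n

def g(n):
    # 2^n when n is a power of two, linear otherwise
    return 2 ** n if n & (n - 1) == 0 else n

# Along the sequence n_k = 2^k the ratio n^2 / 2^n -> 0, so f is an
# asymptotic improvement over g under the permissive definition,
# even though g beats f on all other inputs.
for k in (4, 6, 8, 10):
    n = 2 ** k
    print(n, f(n) / g(n))
```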

Licensed under: CC-BY-SA with attribution