Question

Assume I have a list of functions, for example

$\qquad n^{\log \log(n)}, 2^n, n!, n^3, n \ln n, \dots$

How do I sort them asymptotically, i.e. according to the relation defined by

$\qquad f \leq_O g \iff f \in O(g)$,

assuming they are indeed pairwise comparable (see also here)? Using the definition of $O$ seems awkward, and it is often hard to prove the existence of suitable constants $c$ and $n_0$.

This is about measures of complexity, so we're interested in asymptotic behavior as $n \to +\infty$, and we assume that all the functions take only non-negative values ($\forall n, f(n) \ge 0$).


Solution

If you want a rigorous proof, the following lemma is often more useful, and easier to handle, than the definitions.

If $c = \lim_{n\to\infty} \frac{f(n)}{g(n)}$ exists, then

  • $c = 0 \iff f \in o(g)$,
  • $c \in (0,\infty) \iff f \in \Theta(g)$, and
  • $c = \infty \iff f \in \omega(g)$.

With this, you should be able to order most of the functions coming up in algorithm analysis¹. As an exercise, prove it!
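
For example, to place $n \ln n$ below $n^3$, compute

$\qquad \displaystyle \lim_{n\to\infty} \frac{n \ln n}{n^3} = \lim_{n\to\infty} \frac{\ln n}{n^2} = 0$,

so $n \ln n \in o(n^3)$ by the first case of the lemma.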

Of course, you still have to be able to compute the limits. Some useful tricks for breaking complicated functions down to basic ones are listed below; a quick numerical sanity check follows the list.

  • Express both functions as $e^{\dots}$ and compare the exponents: if their difference tends to $\infty$ (resp. $-\infty$), the original quotient tends to $\infty$ (resp. $0$).
  • More generally: if you have a convex, continuously differentiable and strictly increasing function $h$ so that you can re-write your quotient as

    $\qquad \displaystyle \frac{f(n)}{g(n)} = \frac{h(f^*(n))}{h(g^*(n))}$,

    with $g^* \in \Omega(1)$ and

    $\qquad \displaystyle \lim_{n \to \infty} \frac{f^*(n)}{g^*(n)} = \infty$,

    then

    $\qquad \displaystyle \lim_{n \to \infty} \frac{f(n)}{g(n)} = \infty$.

    See here for a rigorous proof of this rule (in German).

  • Consider continuations of your functions over the reals. You can now use L'Hôpital's rule; be mindful of its conditions²!

  • Have a look at the discrete equivalent, Stolz–Cesàro.
  • When factorials pop up, use Stirling's formula:

    $\qquad \displaystyle n! \sim \sqrt{2 \pi n} \left(\frac{n}{e}\right)^n$
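
Before proving anything, it can help to conjecture the order numerically, as announced above. Here is a minimal Python sketch (the helper ratio is my own and purely illustrative; it suggests a limit but proves nothing):

    from math import e, factorial, log, pi, sqrt

    def ratio(f, g, ns=(10, 100, 1000, 10000)):
        # Print f(n)/g(n) for growing n to guess the limiting behaviour.
        for n in ns:
            print(n, f(n) / g(n))

    # Conjecture: n*ln(n) in o(n^3), so the quotient should tend to 0.
    ratio(lambda n: n * log(n), lambda n: n**3)

    # Stirling: n! / (sqrt(2*pi*n) * (n/e)^n) should tend to 1.
    ratio(lambda n: factorial(n),
          lambda n: sqrt(2 * pi * n) * (n / e) ** n,
          ns=(5, 10, 50, 100))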

It is also useful to keep a pool of basic relations that you prove once and use often, such as the following (a worked example comes after the list):

  • logarithms grow slower than polynomials, i.e.

    $\qquad\displaystyle (\log n)^\alpha \in o(n^\beta)$ for all $\alpha, \beta > 0$.

  • order of polynomials:

    $\qquad\displaystyle n^\alpha \in o(n^\beta)$ for all $\alpha < \beta$.

  • polynomials grow slower than exponentials:

    $\qquad\displaystyle n^\alpha \in o(c^n)$ for all $\alpha$ and $c > 1$.
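
As promised, a worked example: the first relation reduces to the third via the substitution $n = e^t$ (the base of the logarithm only contributes a constant factor), since

$\qquad \displaystyle \lim_{n \to \infty} \frac{(\log n)^\alpha}{n^\beta} = \lim_{t \to \infty} \frac{t^\alpha}{(e^\beta)^t} = 0$

for $e^\beta > 1$; the last limit follows, for instance, by applying L'Hôpital's rule $\lceil \alpha \rceil$ times.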


It can happen that the above lemma is not applicable because the limit does not exist (e.g. when the functions oscillate). In this case, consider the following characterisation of the Landau classes using the limit superior and limit inferior:

With $c_s := \limsup_{n \to \infty} \frac{f(n)}{g(n)}$ we have

  • $0 \leq c_s < \infty \iff f \in O(g)$ and
  • $c_s = 0 \iff f \in o(g)$.

With $c_i := \liminf_{n \to \infty} \frac{f(n)}{g(n)}$ we have

  • $0 < c_i \leq \infty \iff f \in \Omega(g)$ and
  • $c_i = \infty \iff f \in \omega(g)$.

Furthermore,

  • $0 < c_i,c_s < \infty \iff f \in \Theta(g) \iff g \in \Theta(f)$ and
  • $c_i = c_s = 1 \iff f \sim g$.
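
For a concrete oscillating example, take $f(n) = (2 + (-1)^n) \cdot n$ and $g(n) = n$: the quotient alternates between $1$ and $3$, so the plain limit does not exist, yet $c_i = 1$ and $c_s = 3$, and hence $f \in \Theta(g)$.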

Check here and here if you are confused by my notation.


¹ Nota bene: My colleague wrote a Mathematica function that does this successfully for many functions, so the lemma really reduces the task to mechanical computation.

² See also here.

OTHER TIPS

Another tip: sometimes applying a monotone function (like log or exp) to the functions makes things clearer.
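
For example, taking logarithms (to base $2$, say) reveals that $n^{\log \log n}$ and $(\log n)^{\log n}$ are in fact the same function: both have logarithm $\log n \cdot \log \log n$. One caveat: $\log f \in \Theta(\log g)$ does not by itself imply $f \in \Theta(g)$, as $f = n$ and $g = n^2$ show, so transfer conclusions back carefully (cf. the convexity rule in the solution above).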

Skiena provides a sorted list of the dominance relations between the most common functions in his book, The Algorithm Design Manual:

$$n!\gg c^n \gg n^3 \gg n^2 \gg n^{1+\epsilon} \gg n \lg n \gg n \gg n^{1/2}$$ $$ \gg \lg^2n \gg \lg n \gg \frac{\lg n}{\lg\lg n} \gg \lg\lg n \gg \alpha(n) \gg 1$$

Tip: draw graphs of these functions using something like Wolfram Alpha to get a feeling for how they grow. Note that this is not very precise, but if you try it for sufficiently large numbers, you should see the comparative patterns of growth. This of course is no substitute for a proof.

E.g., try plot log(log(n)) from 1 to 10000 to see an individual graph, or plot log(log(n)) and log(n) from 1 to 10000 to see a comparison.
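
If you prefer plotting locally, here is a minimal sketch, assuming a Python environment with numpy and matplotlib installed:

    import numpy as np
    import matplotlib.pyplot as plt

    # Compare the growth of log(log(n)) and log(n) on [2, 10000].
    n = np.linspace(2, 10000, 1000)
    plt.plot(n, np.log(np.log(n)), label="log(log(n))")
    plt.plot(n, np.log(n), label="log(n)")
    plt.legend()
    plt.show()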

I suggest proceeding according to the definitions of the various notations. Start with an arbitrary pair of expressions and determine their order, as outlined below. Then, for each additional element, find its position in the sorted list by binary search, comparing as below. For example, let's sort $n^{\log\log n}$ and $2^n$, the first two functions of $n$, to get the list started.

We use the identity $n = 2^{\log n}$ to rewrite the first expression as $n^{\log\log n} = (2^{\log n})^{\log\log n} = 2^{\log n\log\log n}$. Since $\log n \log\log n \in o(n)$, the definition gives $n^{\log\log n} = 2^{\log n\log\log n} \in o(2^n)$: for any constant $c > 0$ there is an $n_0$ such that for all $n \geq n_0$, $c \cdot n^{\log\log n} = c \cdot 2^{\log n\log\log n} < 2^n$.

Next, we try $3^n$. We compare it to $2^n$, the largest element we have placed so far. Since $3^n = (2^{\log 3})^n = 2^{n\log3}$, we similarly show that $2^n \in o(3^n) = o(2^{n \log 3})$.

Etc.
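
This procedure can also be mimicked numerically to produce a conjectured order before you prove each comparison. Below is a Python sketch; the helper cmp_growth is my own and purely heuristic, since it compares the functions at a single large $n$ and can be fooled by slowly diverging quotients:

    from functools import cmp_to_key
    from math import factorial, log

    def cmp_growth(f, g, n=50):
        # Compare log f(n) and log g(n) at a single largish n.
        # Heuristic only; this is no substitute for a proof.
        a, b = log(f(n)), log(g(n))
        if a < b - 0.5:
            return -1      # guess: f in o(g)
        if a > b + 0.5:
            return 1       # guess: f in omega(g)
        return 0           # guess: f in Theta(g)

    funcs = [
        ("n!",            lambda n: factorial(n)),
        ("n^3",           lambda n: n**3),
        ("2^n",           lambda n: 2**n),
        ("n log n",       lambda n: n * log(n, 2)),
        ("n^(log log n)", lambda n: n ** log(log(n, 2), 2)),
    ]
    funcs.sort(key=cmp_to_key(lambda x, y: cmp_growth(x[1], y[1])))
    print([name for name, _ in funcs])
    # Conjectured order: n log n, n^(log log n), n^3, 2^n, n!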

Here is a list from Wikipedia; the lower an entry sits in the table, the larger its complexity class:

$$\begin{array}{|l|l|} \hline
\text{Name} & \text{Running time} \\ \hline
\text{Constant time} & \mathcal{O}(1) \\
\text{Inverse Ackermann time} & \mathcal{O}(\alpha(n)) \\
\text{Iterated logarithmic time} & \mathcal{O}(\log^* n) \\
\text{Log-logarithmic time} & \mathcal{O}(\log \log n) \\
\text{Logarithmic time} & \mathcal{O}(\log n) \\
\text{Polylogarithmic time} & \mathrm{poly}(\log n) \\
\text{Fractional power} & \mathcal{O}(n^c), \text{ where } 0 < c < 1 \\
\text{Linear time} & \mathcal{O}(n) \\
\text{"n log-star n" time} & \mathcal{O}(n \log^* n) \\
\text{Quasilinear time} & \mathcal{O}(n \log n) \\
\text{Quadratic time} & \mathcal{O}(n^2) \\
\text{Cubic time} & \mathcal{O}(n^3) \\
\text{Polynomial time} & \mathrm{poly}(n) = 2^{\mathcal{O}(\log n)} \\
\text{Quasi-polynomial time} & 2^{\mathcal{O}(\mathrm{poly}(\log n))} \\
\text{Sub-exponential time (first definition)} & \mathcal{O}(2^{n^{\epsilon}}), \epsilon > 0 \\
\text{Sub-exponential time (second definition)} & 2^{o(n)} \\
\text{Exponential time (with linear exponent)} & 2^{\mathcal{O}(n)} \\
\text{Exponential time} & 2^{\mathrm{poly}(n)} \\
\text{Factorial time} & \mathcal{O}(n!) \\ \hline
\end{array}$$

Note: $\mathrm{poly}(x) = x^{\mathcal{O}(1)}$.

Licensed under: CC-BY-SA with attribution
Not affiliated with cs.stackexchange