Question

The example in this answer proves a fact familiar to CS students: that big-$O$ is not a total order. However, most algorithm running times analyzed with big-$O$ notation are not expressed in piecewise form like that example. In fact, most algorithms I am familiar with have running times expressed in terms of polynomials, exponentials, and logarithms.

Consider the recursively defined class of functions that includes $f(n) = c$ for every constant $c$, the function $f(n) = n$, and all functions of the form $f + g$, $f \cdot g$, $\log(f)$, and $\exp(f)$, where $f, g$ are in the class. Does $O$ impose an ordered partition on this class of functions, where functions with the same big-$O$ growth lie in the same part?

Here are my thoughts:

Note that including $f \cdot g$ is actually redundant (at least for positive functions), since $f \cdot g = \exp(\log(f) + \log(g))$. Since the class is defined inductively, perhaps there is an inductive proof.
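The identity behind that redundancy claim can be sanity-checked numerically; this is just a spot check at a couple of positive sample values, not a proof:

```python
import math

# Check f*g = exp(log(f) + log(g)) at sample positive values.
# The identity only makes sense where both factors are positive,
# since log is undefined otherwise.
for f, g in [(3.0, 7.5), (0.25, 120.0), (1e6, 1e-6)]:
    assert math.isclose(f * g, math.exp(math.log(f) + math.log(g)))
```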



Solution

Yes. This was shown by G. H. Hardy in his monograph Orders of Infinity: the logarithmico-exponential functions are totally ordered by eventual domination, so for any two functions $f, g$ in the class, either $f = O(g)$ or $g = O(f)$.
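A consequence of Hardy's theorem is that for any two functions in the class, the ratio $f(n)/g(n)$ tends to a (possibly infinite) limit, so one function eventually dominates the other. A small numeric illustration (not a proof) with the hypothetical pair $f(n) = n \log n$ and $g(n) = n^{1.1}$:

```python
import math

# f(n) = n*log(n) and g(n) = n**1.1 are both in the log-exp class.
# Their ratio simplifies to log(n) / n**0.1, which tends to 0,
# so n*log(n) = O(n**1.1) even though the ratio shrinks very slowly.
def ratio(n):
    return (n * math.log(n)) / (n ** 1.1)

for n in [10**3, 10**12, 10**48, 10**192]:
    print(f"n = 10^{len(str(n)) - 1}: ratio = {ratio(n):.3e}")
```

The ratio decreases toward 0 across these sample points, consistent with the total order: every pair of log-exp functions is comparable, unlike the piecewise counterexamples that show big-$O$ is not a total order in general.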

Licensed under: CC-BY-SA with attribution
Not affiliated with cs.stackexchange