Question

Assume that f(n) goes to infinity as n goes to infinity. Is it then true that f(n) = O(f(n - 1))?

This is a homework problem, so I would appreciate a hint or some guidance rather than the complete answer.


Solution

This isn't true. Consider the function f(n) = n! as a counterexample, which definitely goes toward infinity as n goes to infinity. We can prove, though, that n! ≠ O((n - 1)!).

The proof is by contradiction. Suppose that n! = O((n - 1)!). Then there exist constants c > 0 and n0 such that for any n ≥ n0, we have n! ≤ c(n - 1)!. This means that for any n ≥ n0, we have n! / (n - 1)! ≤ c, or equivalently n ≤ c. But if we pick n = max{n0, c} + 1, then n ≥ n0 and n ≥ c + 1, contradicting n ≤ c. Since we have a contradiction, the assumption must be wrong, and therefore n! ≠ O((n - 1)!).
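If it helps to see the key inequality concretely, here is a small numerical sanity check (a sketch, not a substitute for the proof): for any hypothetical constant c you try, the ratio n! / (n - 1)! = n eventually exceeds it.

```python
from math import factorial

def first_n_exceeding(c, n_max=2000):
    """Return the first n (up to n_max) where n!/(n-1)! exceeds the constant c."""
    for n in range(1, n_max + 1):
        if factorial(n) / factorial(n - 1) > c:  # this ratio is exactly n
            return n
    return None

# No matter which constant we try, the ratio eventually passes it,
# so no single c can satisfy n! <= c * (n-1)! for all sufficiently large n.
for c in (2, 10, 50, 1000):
    print(c, first_n_exceeding(c))
```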

In case you're wondering how I came up with this: my idea was to find a function that grows so rapidly that no matter what constant you picked, the ratio between f(n + 1) and f(n) would eventually get so large that it would exceed that constant. It just happened to be the case that n! fit the bill. In retrospect, I should have remembered that n! ≠ O((n - 1)!) because many algorithms have runtimes like O((n + 1)!), which doesn't simplify down to O(n!).
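For that last remark, the same ratio argument applies directly; a one-line sketch in the notation used above:

```latex
\[
  \frac{(n+1)!}{n!} = n + 1 \to \infty \quad (n \to \infty),
  \qquad \text{so no constant } c \text{ gives } (n+1)! \le c \cdot n! \text{ for all large } n,
  \qquad \text{i.e. } (n+1)! \neq O(n!).
\]
```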

Hope this helps!
