Question

(log n)^k = O(n)? For k greater than or equal to 1.

My professor presented us with this statement in class; however, I am not sure what it means for a function to have a time complexity of O(n). Even with something like n^2 = O(n^2), how can a function f(x) have a run-time complexity?

As for the statement itself, how does it equal O(n) rather than O((log n)^k)?


Solution

(log n)^k = O(n)?

Yes. The definition of big-Oh is that a function f is in O(g(n)) if there exist positive constants N and c such that for all n > N: f(n) <= c*g(n). In this case f(n) is (log n)^k and g(n) is n, so inserting that into the definition gives: "there exist constants N and c such that for all n > N: (log n)^k <= c*n". This holds because log n grows more slowly than n^(1/k) for any fixed k >= 1, so (log n)^k < n for all sufficiently large n; therefore (log n)^k is in O(n).
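To make this concrete (an illustration, not a proof), here is a minimal Python sketch of my own, with k = 3 chosen arbitrarily, that prints the ratio (log n)^k / n. The ratio shrinks toward 0, so any c at least as large as the biggest observed value works from that point on:

    import math

    # Illustration: (log n)^k / n tends to 0 as n grows, so
    # (log n)^k <= c*n eventually holds for any fixed c > 0.
    k = 3  # arbitrary fixed exponent, k >= 1
    for n in (10, 10**3, 10**6, 10**9, 10**12):
        print(f"n = {n:>13}: (log n)^{k} / n = {math.log(n) ** k / n:.3g}")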

how can a function f(x) have a run-time complexity

It doesn't. Nothing about big-Oh notation is specific to run-time complexity. Big-Oh is a notation to classify the growth of functions. Often the functions we're talking about measure the run-time of certain algorithms, but we can use big-Oh to talk about arbitrary functions.
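For instance (a hypothetical, non-runtime example of my own): let f(n) = 5n + 3 count, say, bytes of memory used by some procedure. Then f is in O(n), witnessed by the constants c = 6 and N = 3, and a quick Python check confirms the inequality on a sample range:

    # Hypothetical function: f measures memory here, not time.
    def f(n):
        return 5 * n + 3

    c, N = 6, 3
    # f(n) <= 6n is equivalent to 3 <= n, so it holds for every n > N = 3.
    assert all(f(n) <= c * n for n in range(N + 1, 10_000))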

OTHER TIPS

f(x) = O(g(x)) means f(x) grows more slowly than, or comparably to, g(x).

Technically this is interpreted as "We can find an x value, x_0, and a scale factor, M, such that the size of f(x) past x_0 is less than the scaled size of g(x)." Or in math:

|f(x)| < M |g(x)| for all x > x_0.
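As a worked instance, take the asker's other example, x^2 = O(x^2): choosing M = 2 and x_0 = 1 gives |x^2| < 2|x^2| for all x > 1, so the definition is satisfied; a function always grows "comparably" to itself.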

So for your question:

log(x)^k = O(x)? is asking: is there an x_0 and an M such that log(x)^k < M x for all x > x_0?

The existence of such M and x_0 can be shown using various limit results and is relatively straightforward with L'Hopital's rule; there is probably a nice non-calculus proof somewhere as well.
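One such limit argument, sketched for whole-number k (a non-integer k is squeezed between its two neighboring integers): substituting x = e^t turns log(x)^k / x into t^k / e^t, and applying L'Hopital's rule k times gives

    lim (t -> infinity) t^k / e^t = lim k t^(k-1) / e^t = ... = lim k! / e^t = 0.

Since the ratio tends to 0, it eventually drops below any M > 0, and the point past which that happens supplies the required x_0.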

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow