Question

I'm sitting here with this assignment in a course on algorithms with massive data sets and the use of Little-Oh notation has got me confused, although I'm perfectly confident with Big-Oh.

I do not want a solution to the assignment, and as such I will not present it. Instead, my question is: how do I interpret the time complexity o(log n)?

I know from the definition that an algorithm A in o(log n) must grow asymptotically strictly slower than log n, but I'm uncertain whether this means the algorithm must run in constant time, or whether it is still allowed to be log n under certain conditions, such as c = 1, or something like log(n-1).

Say an algorithm has a running time of O(log n) but in fact does two iterations, so c = 2. Then 2 * log n is still O(log n); am I right when I say that this does not hold for Little-Oh?

Any help is greatly appreciated, and if strictly needed for clarification, I will provide the assignment.


Solution

To say that f is 'little-oh of g', written f = o(g), means that the quotient

|f(x)/g(x)|

approaches 0 as x approaches infinity. Referring to your example of o(log x), that class contains functions like log x / log(log x), sqrt(log x), and many more, so o(log x) definitely doesn't imply O(1). On the other hand, log(x/2) = log x - log 2, so

log (x/2) / log x = 1 - log 2 / log x -> 1

and log (x/2) is not in the class o(log x).
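A quick numerical check makes both limits above concrete; this is an illustrative sketch, not part of the original answer:

```python
import math

# Evaluate the two ratios at increasingly large x:
#   sqrt(log x) / log x  -> 0   (so sqrt(log x) IS in o(log x))
#   log(x/2)   / log x  -> 1   (so log(x/2) is NOT in o(log x))
for x in (10**3, 10**6, 10**12, 10**24):
    lg = math.log(x)
    print(f"x = {x:>25}: "
          f"sqrt(log x)/log x = {math.sqrt(lg) / lg:.4f}, "
          f"log(x/2)/log x = {math.log(x / 2) / lg:.4f}")
```

The first ratio keeps shrinking toward 0, while the second settles near 1, matching the limit 1 - log 2 / log x derived above.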

OTHER TIPS

For Little-Oh, f(x) does not have to be smaller than g(x) for all x. It only has to be smaller after a certain value of x. (For your question: yes, it is still allowed to be log n under certain conditions.)

For example:

 let f(x) = 5000x and g(x) = x^2

f(x) / g(x) as x approaches infinity is 0, so f(x) is little-o of g(x). However, at x = 1, f(x) is greater than g(x). Only after x becomes greater than 5000 will g(x) be bigger than f(x).

What little-o really tells us is that g(x) grows at a faster rate than f(x). For example, look at how much f(x) grew between x = 1 and x = 2:

 f(1) = 5000
 f(2) = 10000 - f(x) became twice as big

Now look at g(x) on the same interval:

 g(1) = 1
 g(2) = 4 - g(x) became four times bigger

This gap in growth rates only widens at larger values of x. Since g(x) increases at a faster rate as x goes to infinity, at some point g(x) will become larger than f(x) and stay larger. But little-o is not concerned with where that crossover happens; it's all about the rate of growth.
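The crossover and the vanishing ratio can both be seen directly with the example functions from above (f(x) = 5000x and g(x) = x^2); this is just an illustrative sketch:

```python
# f is little-o of g: f(x)/g(x) = 5000/x -> 0 as x -> infinity,
# even though f starts out larger and only falls behind at x = 5000.
def f(x):
    return 5000 * x

def g(x):
    return x ** 2

for x in (1, 100, 5000, 10**6, 10**9):
    print(f"x = {x:>10}: f(x) > g(x)? {f(x) > g(x)}, ratio f/g = {f(x) / g(x)}")
```

Before x = 5000 the comparison prints True; after it, the ratio shrinks toward 0, which is exactly the little-o condition.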

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow