Question

Notice that I am asking about little-o here (see a similar question here). For big-O the claim is clearly false; for little-o it feels true, but I can't seem to prove it...

EDIT: glad I raised a debate :) Assume f,g > 0 for simplicity


Solution

It is, at least if g(n) tends to positive infinity as n tends to positive infinity (if it doesn't, counterexamples are easy to find).

Sketch of a proof:

Prerequisite: g(n) tends to positive infinity as n tends to positive infinity.

f(n) in o(g(n)) means:

for every eps > 0 there exists an n0 such that abs(f(n)) < eps*abs(g(n)) for all n > n0.

From this follows:

2^abs(f(n)) < 2^(eps*abs(g(n))) = (2^eps)^g(n) (for n > n0; here we use g > 0).

For eps < 1:

(2^eps)^n is in o(2^n) (as 2^eps < 2) 

it follows that for every eps2 > 0 there exists an n1 such that for all n > n1

(2^eps)^n < eps2*(2^n).

Choosing eps2 = eps yields:

(2^eps)^n < eps*(2^n) for all n > n1 (n1 depends on eps)

Because g(n) -> +inf as n -> +inf, there exists an n2 such that

g(n) > n1 for all n > n2

From this it follows that

(2^eps)^g(n) < eps*2^g(n) for all n > n2.

So for every 0 < eps < 1, taking n3 = max(n0, n2), we have

2^abs(f(n)) < 2^(eps*abs(g(n))) = (2^eps)^g(n) < eps*2^g(n) for all n > n3.

For every eps3 >= 1, pick any 0 < eps < 1; then also:

2^abs(f(n)) < eps*2^g(n) < eps3*2^g(n)

So for every eps > 0 there exists an n3 such that

2^abs(f(n)) < eps*2^g(n) for all n > n3

Because 2^f(n) <= 2^abs(f(n)) for all n, and 2^x > 0 for all real x, it follows that

abs(2^f(n)) < abs(eps*2^g(n)) for all n > n3

q.e.d.

If something is unclear or wrong, please comment.
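A quick numeric illustration of the statement (not a proof), using a pair I picked for demonstration: f(n) = sqrt(n) and g(n) = n, where f is in o(g) and g diverges:

```python
import math

# Illustration only: f(n) = sqrt(n) is in o(g(n)) for g(n) = n, and g -> +inf,
# so by the argument above the ratio 2^f(n) / 2^g(n) should tend to 0.
def ratio(n):
    return 2.0 ** math.sqrt(n) / 2.0 ** n

for n in [10, 20, 40, 80]:
    print(n, ratio(n))  # the ratio shrinks rapidly toward 0
```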

EDIT: Some thoughts on other g(n):

Case 1: g(n) has a bounded subsequence, i.e. there exists some x such that there is no n0 with abs(g(n)) >= x for all n > n0:

o(g(n)) then consists only of functions that are constantly 0 after some n or converge to 0. 2^g(n) then also has a bounded subsequence, but 2^f(n) converges to 1 (or is constantly 1 after some point), so it cannot be in o(2^g(n)).

Case 2: there is no n0 such that g(n) > 0 for all n > n0:

2^g(n) < 1 whenever g(n) < 0, so 2^g(n) has a bounded subsequence, meaning o(2^g(n)) consists only of functions that are constantly 0 after some n or converge to 0.
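To make the bounded-g case concrete, here is a small check with a hypothetical pair: f(n) = 1/n and g(n) = 1 (bounded, not diverging). Then f is in o(g), yet the ratio 2^f / 2^g settles at 1/2 instead of 0:

```python
# Example chosen for illustration: f(n) = 1/n, g(n) = 1 (bounded, not diverging).
# f(n)/g(n) -> 0, so f is in o(g), yet 2^f(n) / 2^g(n) -> 2^0 / 2^1 = 1/2,
# so 2^f is not in o(2^g) when g does not tend to infinity.
def ratio(n):
    return 2.0 ** (1.0 / n) / 2.0

for n in [10, 100, 1000]:
    print(n, ratio(n))  # approaches 0.5, never 0
```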

OTHER TIPS

Here's an answer. The result depends on the convergence properties of g(n).

First consider the related limit:

lim(n->inf) log_2 ((2^(f(n))) / (2^(g(n))))
=
lim(n->inf) ( log_2(2^(f(n))) - log_2(2^(g(n))) )
=
lim(n->inf) ( f(n) - g(n) ) = lim(n->inf) ( g(n) * (f(n)/g(n) - 1) )
=
lim(n->inf) ( -g(n) ) = - lim(n->inf) g(n)

(the last step uses f(n)/g(n) -> 0, which is exactly what f(n) = o(g(n)) gives us).

... Now, to get this into the form of the original question in your post, we can switch the limit and the log (valid here because log_2 is continuous on (0, inf), extended with log_2(0) = -inf), so we have:

lim(n->inf) log_2 ((2^(f(n))) / (2^(g(n))))
=
log_2 lim(n->inf) ((2^(f(n))) / (2^(g(n)))).

Now, moving on to answer the question. If it is true that

2^(f(n)) = o(2^(g(n))),

then in the limit, the right-hand side becomes

log_2 (0) = - inf

... so in this scenario it must be true that

lim(n->inf) g(n) = inf

This result does seem to make sense. Consider f(x) = exp(-x) and g(x) = 1 - exp(-x). Clearly, in this example f(x) = o(g(x)). However, 2^(f(x)) is not o(2^(g(x))) because

lim(x->inf) (2^exp(-x)) / (2^(1-exp(-x))) = 1/2.

But given f(x) = exp(x), g(x) = exp(2x), where also f(x) = o(g(x)) and where lim(x->inf) g(x) = inf, meeting the above condition, we have

lim(x->inf) (2^exp(x)) / (2^exp(2x))
=
lim(x->inf) 1 / 2^(exp(x)*(exp(x) - 1)) = 0

so 2^(f(x)) = o(2^(g(x))).
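Both examples above can be sanity-checked numerically (illustration only; r1 and r2 are names introduced here for the two ratios):

```python
import math

# Example 1: f(x) = exp(-x), g(x) = 1 - exp(-x); the ratio tends to 1/2.
def r1(x):
    return 2.0 ** math.exp(-x) / 2.0 ** (1.0 - math.exp(-x))

# Example 2: f(x) = exp(x), g(x) = exp(2x); the ratio tends to 0.
# Written with a negative exponent to avoid floating-point overflow.
def r2(x):
    return 2.0 ** -(math.exp(x) * (math.exp(x) - 1.0))

print(r1(20))  # ~0.5
print(r2(3))   # vanishingly small
```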

Easy disproof with an example:

f(n) = O(g(n)) does not imply
2^(f(n)) = O(2^(g(n))).

Let f(n) = 2 log n and g(n) = log n
(Assume log is to the base 2)

We know 2 log n <= c(log n) for c = 2, therefore f(n) = O(g(n))

2^(f(n)) = 2^(2 log n) = 2^(log n^2) = n^2
2^(g(n)) = 2^(log n) = n

We know that
n^2 is not O(n)

Therefore, 2^(f(n)) is not O(2^(g(n))).
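A quick numeric confirmation of this counterexample (illustration only):

```python
import math

# f(n) = 2*log2(n) is O(g(n)) for g(n) = log2(n), but
# 2^f(n) = n^2 and 2^g(n) = n, so the ratio 2^f / 2^g = n is unbounded.
def ratio(n):
    f = 2 * math.log2(n)
    g = math.log2(n)
    return 2.0 ** f / 2.0 ** g  # equals n^2 / n = n

for n in [10, 100, 1000]:
    print(n, ratio(n))  # grows without bound, so 2^f is not O(2^g)
```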

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow