Rates in algorithm analysis? [closed]
Question
WHY do logarithms grow slower than any polynomial? What is the (understandable) proof for this?
Similarly,
WHY do exponentials always grow faster than any polynomial?
Solution
EDIT: This answer is essentially doing what PengOne said.
We take the limit of
log_2(x) / x^p
for constant p > 0 and show that the limit is zero. Since both log_2(x) and x^p go to infinity as x grows without bound, we apply l'Hopital's rule. This means our limit is the same as the limit of
(1 / (x * ln 2)) / (p * x^(p-1))
Using simple rules of fractions, we reduce this to
1 / (p * x^p * ln2)
Since the denominator goes to infinity while the numerator is constant, we can evaluate the limit - it's zero, which means that log_2(x) grows asymptotically more slowly than x^p, regardless of the (positive) value of p.
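A quick numeric sketch of this limit (not part of the original answer; the exponent p = 0.5 is an arbitrary choice): evaluating log_2(x) / x^p at ever larger x shows the ratio shrinking toward zero, as the l'Hopital argument predicts.

```python
import math

# Evaluate log2(x) / x^p for a fixed p > 0 at increasing x.
# The ratio should tend to 0, matching the limit derived above.
p = 0.5
for x in [10, 10**3, 10**6, 10**12]:
    ratio = math.log2(x) / x**p
    print(f"x = 10^{round(math.log10(x))}: log2(x)/x^p = {ratio:.6f}")
```

By x = 10^12 the ratio is already well below 10^-4, so the polynomial dominates even for a small exponent like 0.5.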
Other tips
Given two (nonnegative) real-valued functions f and g, you want to compute
lim_{x -> infinity} f(x) / g(x)
This limit is:
- 0 if and only if f grows slower than g
- infinity if and only if f grows faster than g
- c for some constant 0 < c < infinity if and only if f and g grow at the same rate
Now you can take any examples you like and compute the limits to see which grows faster.
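As a rough illustration (the example pairs are mine, not from the answer), one can estimate each limit by evaluating f(x)/g(x) at a single large x; each of the three cases above shows up:

```python
import math

# Estimate lim f(x)/g(x) by sampling the ratio at a large x.
# One example pair per case: ratio -> 0, ratio -> infinity, ratio -> constant.
cases = {
    "log2(x) vs x": (lambda x: math.log2(x), lambda x: x),  # slower: ratio near 0
    "x^2 vs x":     (lambda x: x**2,         lambda x: x),  # faster: ratio huge
    "3x + 7 vs x":  (lambda x: 3*x + 7,      lambda x: x),  # same rate: ratio near 3
}
big = 10.0**9
for name, (f, g) in cases.items():
    print(f"{name}: ratio at x = 1e9 is {f(big) / g(big):.6g}")
```

Sampling at one point is of course only suggestive, not a proof; the limit arguments above are what actually settle each case.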
You could consider the derivatives.
d(x^n)/dx = nx^(n-1)
d(ln x)/dx = 1/x
For n >= 1, nx^(n-1) increases with x (or stays constant when n = 1), whereas 1/x decreases with x, so the polynomial grows quicker.
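The derivative comparison can be sketched numerically (a minimal check, with n = 2 chosen arbitrarily): the polynomial's slope grows as x grows, while the logarithm's slope shrinks.

```python
# Compare the slopes from the text: d/dx x^n = n*x^(n-1) versus d/dx ln(x) = 1/x.
# For n >= 1 the first is nondecreasing in x while the second decreases.
n = 2
for x in [1.0, 10.0, 100.0]:
    print(f"x = {x}: polynomial slope = {n * x**(n-1)}, log slope = {1 / x}")
```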
The logarithm of e^x is x, whereas the logarithm of x^n is n ln x, so using the above argument to compare x and n ln x, e^x grows quicker than x^n.
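The log comparison above can be sketched like so (n = 10 is an arbitrary choice of mine): once x is large enough, log(e^x) = x overtakes log(x^n) = n ln x, so e^x overtakes x^n.

```python
import math

# Compare the logarithms of e^x and x^n: log(e^x) = x, log(x^n) = n*ln(x).
# For small x the polynomial's log can be larger, but x eventually dominates n*ln(x).
n = 10
for x in [10, 100, 1000]:
    print(f"x = {x}: log(e^x) = {x}, log(x^{n}) = {n * math.log(x):.2f}")
```

Note the crossover: at x = 10 we have n ln x > x, but by x = 100 the exponential's logarithm has already pulled ahead. "Grows faster" is an eventual (asymptotic) statement.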