Improving time complexity from O(log n / log log n) to O(log((n log log n)/log n) / log log((n log log n)/log n))

cs.stackexchange https://cs.stackexchange.com/questions/119338

Question

Suppose I have an algorithm whose running time is $O(f(n))$, where $f(n) = O\left(\frac{\log n}{\log\log n}\right)$.

Suppose further that I can change this running time in $O(1)$ steps into $O\left(f\left(\frac{n}{f(n)}\right)\right)$, i.e. I can get an algorithm whose running time is $O(g(n)) = O\left(\frac{\log\frac{n}{\frac{\log n}{\log\log n}}} {\log\log\frac{n}{\frac{\log n}{\log\log n}}}\right) = O\left(\frac{\log\frac{n\log\log n}{\log n}} {\log\log\frac{n\log\log n}{\log n}}\right)$.

I'm pretty sure that $g(n) < f(n)$ for big enough $n$ (based on experiments in Wolfram Alpha), but I wasn't able to prove it.

My questions are:

  1. Is $g(n) < f(n)$ in fact true (starting from some $n$)?

  2. Is $g(n)$ asymptotically better than $f(n)$, i.e. is $g(n) = o(f(n))$?

  3. Assuming this is asymptotically better, I can do this step again and further improve the running time of the algorithm. Meaning that in 1 more step I can make my algorithm run in time $O\left(f\left(\frac{n}{f\left(\frac{n}{f(n)}\right)}\right)\right)$, and I can repeat this process as many times as I want. How many times should the process be repeated to get the best asymptotic running time, and what will it be? Obviously, repeating it $O(f(n))$ times already costs $O(f(n))$ time for the repetitions alone and will not improve the overall complexity of the algorithm.


Solution

Here are the functions. $$f(n) = \frac{\log(n)}{\log\log(n)}$$ $$h(n)=\frac n{f(n)}=\frac{n\log\log(n)}{\log(n)}$$ $$g(n)=f(h(n))=\frac{\log(\frac{n\log\log(n)}{\log(n)})} {\log\log(\frac{n\log\log(n)}{\log(n)})}$$
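
For concreteness, here is a quick numerical sketch of the three functions (a sanity check, not a proof; natural logarithms are assumed, and the names `f`, `h`, `g` follow the definitions above):

```python
from math import log

def f(n):
    return log(n) / log(log(n))

def h(n):
    return n / f(n)  # equals n * loglog(n) / log(n)

def g(n):
    return f(h(n))

for n in (10**3, 10**6, 10**9, 10**12):
    print(f"n = {n:>13}   f(n) = {f(n):7.3f}   h(n) = {h(n):.3e}   g(n) = {g(n):7.3f}")
```

Already for these moderate values, $g(n)$ is visibly smaller than $f(n)$, which matches the answer to the first question below.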


  1. Is $g(n) < f(n)$ in fact true (starting from some $n$)?

Yes, here is a proof.

Let us compute the derivative of $f(x)$ with respect to $x$ as a real variable.

$$\frac{df}{dx} =\frac{\frac{\log\log x}{x}-\frac{\log x}{x\log x}}{(\log\log x)^2} =\frac{\log\log x-1}{x\ (\log\log x)^2}$$

So $\frac{df}{dx}>0$ when $x > e^e$. That means $f(x)$ is strictly increasing once $x > e^e$.

If $n$ is large enough, $e^e\lt h(n) \lt n$, which implies $f(h(n)) < f(n)$, i.e., $g(n) < f(n)$.

If we want to be specific, we can show that $e^e\lt h(n) \lt n$ when $n > e^{e+2}$.
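
For example, using natural logarithms: when $n \ge e^{e+2}$ we have $\log\log n > 1$, so

$$h(n)=\frac{n\log\log n}{\log n} > \frac{n}{\log n} \ge \frac{e^{e+2}}{e+2} > e^e,$$

since $\frac{x}{\log x}$ is increasing for $x>e$; and $h(n) < n$ because $\log\log n < \log n$ for every $n>1$.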

  2. Is $g(n)$ asymptotically better than $f(n)$, i.e. is $g(n) = o(f(n))$?

No, $g(n) = \Theta(f(n))$. In fact, $\displaystyle\lim_{n\to\infty}\frac{g(n)}{f(n)}=1$.
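
A quick numerical illustration (same $f$ and $g$ as above, natural logarithms; the identity $\log\frac{n}{f(n)} = \log n - \log f(n)$ is used so that very large $n$ can be sampled):

```python
from math import log

def f(n):
    return log(n) / log(log(n))

def g(n):
    # g(n) = f(n / f(n)); use log(n / f(n)) = log(n) - log(f(n))
    # so the huge quotient n / f(n) never has to fit in a float
    log_h = log(n) - log(f(n))
    return log_h / log(log_h)

for k in (3, 10, 100, 1000, 10000):
    n = 10**k
    print(f"n = 10^{k:<6} g(n)/f(n) = {g(n)/f(n):.4f}   g(n) - f(n) = {g(n) - f(n):+.3f}")
```

The ratio creeps toward $1$ while the difference stays bounded, which is also what Exercise 2 below is about.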

  3. Assuming this is asymptotically better, I can do ...

Well, that assumption is invalid.


Here are two exercises.

Exercise 1. Show that $\displaystyle\lim_{n\to\infty}\frac{g(n)}{f(n)}=1$ using the following hint or steps.

  1. Recall that $\log(n) < n^\epsilon$ for any constant $\epsilon>0$ when $n$ is large enough.

  2. Show that $\displaystyle\lim_{n\to\infty}\frac{\log(\frac{n\log\log(n)}{\log(n)})}{\log(n)}=1$.

  3. Show that $\displaystyle\lim_{n\to\infty}\frac{\log\log(\frac{n\log\log(n)}{\log(n)})}{\log\log(n)}=1$.

Exercise 2. Show that $g(n)-f(n)=O(1)$. (This exercise might be hard.)

Licensed under: CC-BY-SA with attribution
Not affiliated with cs.stackexchange