Question

Given the following recurrence

$$ T(n) = 2T\left(\frac{n}{2}\right)+n\log n$$ we want to apply the Master theorem and note that

$$ n^{\log_2(2)} = n.$$

Now we check the first two cases for $\varepsilon > 0$, that is whether

  • $n\log n \in O(n^{1-\varepsilon})$ or
  • $n\log n \in \Theta(n)$.

Neither of these two cases is satisfied. So we have to check the third case, that is, whether

  • $n\log n \in \Omega(n^{1+\varepsilon})$ .

I think the third condition is not satisfied either. But why? And what would be a good explanation for why the Master theorem cannot be applied in this case?


Solution

The three cases of the Master Theorem that you refer to are proved in Introduction to Algorithms by Thomas H. Cormen, Charles E. Leiserson, Ronald L. Rivest, and Clifford Stein (2nd edition, 2001).

You have correctly observed that the recurrence in question falls between Case 2 and Case 3: $f(n) = n \log n$ grows faster than $n$ but slower than $n^{1+\varepsilon}$ for every $\varepsilon > 0$.

However the theorem can be generalized to cover this recurrence. Consider

Case 2A: Suppose $f(n) = \Theta (n^{\log_b a }\log_b^k n)$ for some constant $k\geq 0$.

This case reduces to Case 2 when $k = 0$. Intuitively, the recursion tree has $\Theta (\log_b n)$ levels, and $f$ contributes a comparable amount at each level, which yields the extra logarithmic factor. A sketch of a more formal proof can be found below. The final result is that

$$T(n) = \Theta (n^{\log_b a }\log_b^{k+1} n).$$

In Introduction to Algorithms this statement is left as an exercise.

Applying this statement to the recurrence in question we finally get $$ T(n) = \Theta(n \cdot \log^2 n). $$
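As an informal sanity check (my own addition, not part of the proof), one can evaluate the recurrence numerically and watch the ratio $T(n)/(n \log^2 n)$ settle toward a constant; the base case $T(1) = 1$ is an assumption made purely for this sketch:

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def T(n):
    # T(n) = 2 T(n/2) + n log2(n), with the assumed base case T(1) = 1
    if n <= 1:
        return 1.0
    return 2 * T(n // 2) + n * math.log2(n)

# If T(n) = Theta(n log^2 n), this ratio should approach a constant
# (it tends to 1/2 when logs are taken base 2).
for k in (10, 15, 20):
    n = 2 ** k
    print(k, T(n) / (n * math.log2(n) ** 2))
```

For exact powers of $2$ the ratio is $\frac{k(k+1)/2 + 1}{k^2}$, which visibly decreases toward $1/2$ as $k$ grows.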

More details on the Master Theorem can be found in the excellent (imho) Wikipedia page.

As @sdcvvc points out in the comments, to prove that Case 3 does not apply here one can invoke L'Hospital's rule, which says that $$ \lim_{x\to c} \frac{f(x)}{g(x)} = \lim_{x\to c} \frac{f^\prime(x)}{g^\prime(x)} $$ for functions $f(x)$ and $g(x)$ differentiable in the vicinity of $c$, provided the quotient is of an indeterminate form and the right-hand limit exists. Applying this to $f(n) = \log n$ and $g(n) = n^{\varepsilon}$ shows that $$ \frac{n \log n}{n^{1+\varepsilon}} = \frac{\log n}{n^{\varepsilon}} \to 0, $$ so $n \log n \in o(n^{1+\varepsilon})$ and in particular $n \log n \not\in \Omega (n^{1+\varepsilon}).$
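The vanishing of that ratio can be illustrated numerically (with $\varepsilon = 0.1$ chosen arbitrarily for this sketch); if Case 3 held, the ratio would have to stay bounded away from $0$:

```python
import math

eps = 0.1  # any fixed positive epsilon works; 0.1 is an arbitrary choice
for n in (1e6, 1e12, 1e30):
    # (n log n) / n^{1+eps} simplifies to log(n) / n^eps
    print(n, math.log2(n) / n ** eps)
```

The printed ratios shrink steadily, consistent with $n \log n \in o(n^{1+\varepsilon})$.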


Sketch of the Proof of the Master Theorem for Case 2A.

This is a reproduction of parts of the proof from Introduction to Algorithms with the necessary modifications.

First we prove the following Lemma.

Lemma A:

Consider a function $$ g(n) = \displaystyle \sum_{j=0}^{\log_b n - 1} a^j h(n/b^j) $$ where $h(n) = n^{\log_b a} \log_b^k n.$ Then $g(n) = \Theta(n^{\log_b a} \log_b^{k +1} n).$

Proof: Substituting $h(n)$ into the expression for $g(n)$ and using $a^j (n/b^j)^{\log_b a} = n^{\log_b a}$, one gets $$g(n) = \displaystyle n^{\log_b a} \sum_{j=0}^{\log_b n - 1} \log_b^k \frac{n}{b^j} = n^{\log_b a} \sum_{j=0}^{\log_b n - 1} (\log_b n - j)^k = n^{\log_b a} \sum_{i=1}^{\log_b n} i^k = \Theta\left(n^{\log_b a} \log_b^{k +1} n\right),$$ where the last step uses $\sum_{i=1}^{m} i^k = \Theta(m^{k+1})$.

QED
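A numeric spot-check of Lemma A with arbitrarily chosen parameters $a = 4$, $b = 2$, $k = 1$ (so $h(n) = n^2 \log_2 n$); the lemma predicts $g(n) = \Theta(n^2 \log_2^2 n)$, so the ratio below should settle toward a constant:

```python
import math

a, b, k = 4, 2, 1  # illustrative parameters; note log_b a = 2

def h(n):
    # h(n) = n^{log_b a} * (log_b n)^k
    return n ** 2 * math.log2(n) ** k

def g(n):
    L = int(math.log2(n))
    return sum(a ** j * h(n / b ** j) for j in range(L))

# The ratio g(n) / (n^2 log_2^2 n) approaches 1/2 for these parameters.
for m in (8, 16, 32):
    n = 2 ** m
    print(m, g(n) / (n ** 2 * math.log2(n) ** 2))
```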

If $n$ is an exact power of $b$, then given a recurrence $$ T(n) = a T(n/b) + f(n),\quad T(1) = \Theta(1), $$ one can rewrite it (by unrolling the recursion tree) as $$ T(n) = \Theta(n^{\log_b a}) + \displaystyle \sum_{j=0}^{\log_b n - 1} a^j f(n/b^j). $$ Substituting $f(n)$ with $\Theta(n^{\log_b a} \log_b^k n)$, moving $\Theta$ outside the sum and applying Lemma A, we get

$$T(n) = \Theta (n^{\log_b a }\log_b^{k+1} n).$$

Generalizing this to an arbitrary integer $n$ that is not a power of $b$ is beyond the scope of this post.

OTHER TIPS

The Akra-Bazzi theorem is a strict generalization of the master theorem. As a bonus its proof is a blizzard of integrals that will make your head spin ;-)
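As a sanity check (my own computation, not claimed by either source), Akra–Bazzi applied to this recurrence reproduces the same bound. With $a = 2$ and $b = 1/2$, the exponent $p$ solves $2 \cdot (1/2)^p = 1$, giving $p = 1$:

```latex
% Akra–Bazzi: T(n) = \Theta\left( n^p \left( 1 + \int_1^n \frac{f(u)}{u^{p+1}}\,du \right) \right), here with p = 1
\[
T(n) = \Theta\!\left( n \left( 1 + \int_1^n \frac{u \ln u}{u^{2}}\,du \right) \right)
     = \Theta\!\left( n \left( 1 + \tfrac{1}{2}\ln^2 n \right) \right)
     = \Theta\!\left( n \log^2 n \right).
\]
```

This agrees with the generalized Case 2A result above.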

In any case, Sedgewick in his "An Introduction to the Analysis of Algorithms" argues convincingly that one should strive to prove $T(n) \sim g(n)$-type asymptotics.

Licensed under: CC-BY-SA with attribution
Not affiliated with cs.stackexchange