Question

I'm taking a Data Structures and Algorithms course and I'm stuck on this recurrence:

T(n) = log(n) * T(log(n)) + n

Obviously this can't be handled with the Master Theorem, so I was wondering if anybody has any ideas for solving this recurrence. I'm pretty sure it should be solved with a change of variables, like setting n = 2^m, but I couldn't manage to make it work.
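
As a quick sanity check (not a solution), the recurrence can be evaluated numerically. The base case T(n) = n for n <= 2 and rounding log n down to an integer are assumptions made only so that the recursion terminates; a sketch in Python:

    import math
    from functools import lru_cache

    # Numerical sanity check, not a proof: assumes T(n) = n for n <= 2 and
    # rounds log2(n) down to an integer so the recursion is well defined.
    @lru_cache(maxsize=None)
    def T(n: int) -> float:
        if n <= 2:
            return float(n)
        return math.log2(n) * T(int(math.log2(n))) + n

    for k in (4, 8, 16, 32, 64):
        n = 2 ** k
        print(f"n = 2^{k:2d}   T(n)/n = {T(n) / n:.4f}")

The printed ratio T(n)/n settles toward 1 as n grows, which matches the Theta(n) (indeed n + o(n)) answer below.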

Solution

The answer is Theta(n). To prove something is Theta(n), you have to show it is both Omega(n) and O(n). Omega(n) in this case is obvious because T(n) >= n. To show that T(n) = O(n), first:

  1. Pick a large finite value N such that log(n)^2 < n/100 for all n > N. This is possible because log(n)^2 = o(n).
  2. Pick a constant C > 100 such that T(n) < C*n for all n <= N. This is possible because N is finite.

We will show by induction that T(n) < C*n for all n > N. For such n we have log(n) < n, so T(log(n)) < C*log(n) (by the choice of C if log(n) <= N, or by the induction hypothesis otherwise). Therefore:

T(n) < n + log(n) * C * log(n)
     = n + C * log(n)^2
     < n + (C/100) * n            (by the choice of N)
     = C * (1/C + 1/100) * n
     < C * (1/100 + 1/100) * n    (since C > 100)
     = (C/50) * n
     < C * n

In fact, for this function it is even possible to show that T(n) = n + o(n) using a similar argument.
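
Written out, that sharper bound is one more unrolling of the recurrence, reusing the constants C and N above and the bound T(m) < C*m just proved (in LaTeX notation):

    T(n) = n + \log(n)\, T(\log n) \le n + C \log^2(n) = n + o(n)

since C*log(n)^2 = o(n).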

Other tips

This is by no means an official proof, but I think it goes like this.

The key is the + n part. Because of it, T(n) is bounded below by n, so T(n) = Omega(n). So let's assume that T(n) = O(n) as well and check whether that is consistent.

Substituting into the original relation:

T(n) = (log n)O(log n) + n
     = O(log^2(n)) + O(n)
     = O(n)

So the assumption T(n) = O(n) is consistent with the recurrence.
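
The fact both answers lean on is that log(n)^2 grows far more slowly than n. A tiny numeric illustration (just a sanity check, with base-2 logs as an arbitrary choice):

    import math

    # log2(n)^2 / n shrinks toward 0 as n grows, i.e. log(n)^2 = o(n).
    for k in (10, 20, 30, 40, 50):
        n = 2 ** k
        print(f"n = 2^{k}   log2(n)^2 / n = {math.log2(n) ** 2 / n:.2e}")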

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow