Question

I have the following recursion:

T(n) = T(2*n / 3) + T(n / 3) + O(n log n)

I need to know the exact complexity; I know the Master theorem won't help me here.

Please tell me how to do this in general for such recurrences. I need the complexity, and I want to understand how to solve such problems.

Thanks in advance.

Solution

A generalization of the Master Theorem is the Akra-Bazzi method.

Assuming that your O(n log n) is really Θ(n log n), we have g(x) = x log x, ai = 1 for i = 1 and i = 2, b1 = 2/3 and b2 = 1/3. The Akra-Bazzi method states T(x) = Θ( x^p ⋅ (1 + ∫₁ˣ g(u)/u^(p+1) du) ), where p satisfies b1^p + b2^p = 1. Here b1^p + b2^p = 1 when p = 1, g(u)/u^(p+1) = (log u)/u has integral (log² u)/2 over [1, x], and so T(x) is Θ(x log² x).
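If you want a quick numerical sanity check of that result, here is a minimal sketch (not part of the original answer; it assumes the O(n log n) term is exactly n⋅log2(n), that subproblem sizes are floored to integers, and that T(n) = 1 for n < 2):

    # Solve the recurrence exactly (under the stated assumptions) with memoization
    # and watch T(n) / (n * log2(n)^2) flatten out, consistent with Theta(n log^2 n).
    from functools import lru_cache
    from math import log2

    @lru_cache(maxsize=None)
    def T(n):
        if n < 2:
            return 1.0
        return T(2 * n // 3) + T(n // 3) + n * log2(n)

    for n in [10**3, 10**4, 10**5, 10**6]:
        print(n, T(n) / (n * log2(n) ** 2))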

OTHER TIPS

You could use a simplification to get an upper bound.

1. Way:

T(n) = T(2/3 ⋅ n) + T(1/3 ⋅ n) + O(n log n)
     ≤ T(2/3 ⋅ n) + T(2/3 ⋅ n) + O(n log n)
     = 2 T(2/3 ⋅ n) + O(n log n)

Now you can use the master theorem. This gives you O(n^(log_{3/2} 2)) = O(n^1.7095). Because we used a ≤ in the simplification, we only get O instead of Θ.
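To see where that exponent comes from and to check the simplified recurrence numerically, here is a small sketch (again not from the original answer; it assumes an exact cost of n⋅log2(n) per call, floored sizes, a base case of 1, and uses U as a name for the simplified recurrence):

    # The critical exponent of the master theorem for U(n) = 2*U(2n/3) + n*log2(n)
    # is log_{3/2}(2) ~ 1.7095; the ratio U(n) / n^p should stay within a bounded range.
    from functools import lru_cache
    from math import log, log2

    p = log(2) / log(3 / 2)  # ~1.7095

    @lru_cache(maxsize=None)
    def U(n):
        if n < 2:
            return 1.0
        return 2 * U(2 * n // 3) + n * log2(n)

    for n in [10**3, 10**4, 10**5, 10**6]:
        print(n, U(n) / n**p)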

2. Way:
You can 'count' the nodes of the calling tree, which is a binary tree in your case. The tree has maximum depth log_{3/2}(n), and because the subproblem sizes on each level sum to at most n, it has O(n) leaves and therefore O(n) nodes in total. Each node contributes at most O(n log n), so you get O(n) ⋅ O(n log n) = O(n² log n) in total.
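A quick way to convince yourself that the tree really has O(n) nodes is to count them directly; a small sketch of my own (it floors the sizes and treats anything below 2 as a leaf):

    # nodes(n) counts the nodes of the recursion tree of T(n) = T(2n/3) + T(n/3) + ...
    # Memoization is safe here because the count depends only on n.
    # The printed ratio nodes(n) / n stays bounded, i.e. the tree has O(n) nodes.
    from functools import lru_cache

    @lru_cache(maxsize=None)
    def nodes(n):
        if n < 2:
            return 1
        return 1 + nodes(2 * n // 3) + nodes(n // 3)

    for n in [10**3, 10**4, 10**5, 10**6]:
        print(n, nodes(n) / n)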

Sure, this is very imprecise. We can do better if we split the tree into two parts:

  1. The upper part, above depth (1/2) ⋅ log n: O(2^((1/2) ⋅ log n)) = O(sqrt(n)) nodes. These nodes each have a weight of O(n log n).
  2. The lower part, below depth (1/2) ⋅ log n: O(2^(log n) - sqrt(n)) = O(n) nodes. These nodes each have a weight of O(sqrt(n) ⋅ log n).

In total you get: T(n) = O(sqrt(n)) ⋅ O(n log n) + O(n) ⋅ O(sqrt(n) ⋅ log n) = O(n^1.5 ⋅ log n).

I hope this helps.

EDIT: O(n^1.5 ⋅ log n) ⊂ O(n^1.7095).

EDIT: You can write O(n^1.5 ⋅ log n) as O(n ⋅ sqrt(n) ⋅ log n). As @Teepeemm showed, the exact complexity is Θ(n log²(n)), which improves the upper bound I showed by replacing the factor sqrt(n) with log(n).
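To get a feel for how these bounds relate, here is a tiny illustrative comparison (not part of the original answer) of the three growth rates mentioned above:

    # n log^2 n (exact) grows much slower than n^1.5 log n (tree-splitting bound),
    # which in turn grows much slower than n^1.7095 (master-theorem bound).
    from math import log2

    for n in [10**4, 10**6, 10**8]:
        print(n, n * log2(n)**2, n**1.5 * log2(n), n**1.7095)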

Licensed under: CC-BY-SA with attribution