You could use a simplification to get an upper bound.
First way:
T(n) = T(2/3 ⋅ n) + T(1/3 ⋅ n) + O(n log n)
≤ T(2/3 ⋅ n) + T(2/3 ⋅ n) + O(n log n)
= 2 T(2/3 ⋅ n) + O(n log n)
Now you can use the master theorem with a = 2 subproblems and shrink factor b = 3/2. This gives O(n^(log_{3/2} 2)) = O(n^1.7095). Because we used a ≤ above, we get O instead of Θ.
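If you want to check that exponent numerically, log_{3/2}(2) can be computed directly (a quick sketch of my own, not part of the original derivation):

```python
import math

# T(n) <= 2*T((2/3)*n) + O(n log n): a = 2 subproblems, shrink factor b = 3/2.
# The master theorem (case 1) then gives T(n) = O(n^(log_b a)).
a, b = 2, 3 / 2
exponent = math.log(a, b)  # two-argument log: log base b of a

print(round(exponent, 4))  # 1.7095
```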
Second way:
You can 'count' the nodes of the call tree, which in your case is a binary tree. This tree has depth log_{3/2}(n) = O(log n), and since the subproblem sizes on each level sum to at most n, it has at most n leaves and therefore O(n) nodes. Each node costs at most O(n log n), so you get O(n) ⋅ O(n log n) = O(n² log n) in total.
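To see that the call tree really has O(n) nodes, you can count them for an integer version of the recurrence (my own sketch; the ⌊2n/3⌋ / remainder split is an assumption, the original only gives the fractions):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def nodes(n):
    """Node count of the call tree of T(n) = T(2n/3) + T(n/3) + O(n log n)."""
    if n <= 1:
        return 1                 # a size-1 subproblem is a leaf
    left = (2 * n) // 3          # size of the T(2n/3) subproblem
    return 1 + nodes(left) + nodes(n - left)

print(nodes(1000))  # 1999, i.e. about 2n — linear, not quadratic
```

Every internal node has exactly two children and every leaf has size 1, so the count works out to exactly 2n − 1.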
Sure, this is very imprecise. We can do better if we split the tree into two parts:
- The upper part, above depth 1/2 ⋅ log n: it has O(2^(1/2 ⋅ log n)) = O(sqrt(n)) nodes, and each of these nodes has a weight of O(n log n).
- The lower part, below depth 1/2 ⋅ log n: it has O(2^(log n) − sqrt(n)) = O(n) nodes, and each of these nodes has a weight of O(sqrt(n) log n), since its subproblem has size at most sqrt(n).
In total you get: T(n) = O(sqrt(n)) ⋅ O(n log n) + O(n) ⋅ O(sqrt(n) log n) = O(n^1.5 log n).
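Both parts of the split contribute the same order, which a throwaway computation confirms (the concrete n here is just an illustration):

```python
import math

n = 10**6
upper = math.sqrt(n) * (n * math.log2(n))   # O(sqrt(n)) nodes of weight n log n
lower = n * (math.sqrt(n) * math.log2(n))   # O(n) nodes of weight sqrt(n) log n

# Both match n^1.5 * log n up to floating-point noise.
print(math.isclose(upper, lower), math.isclose(upper, n**1.5 * math.log2(n)))
```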
I hope this helps.
EDIT: O(n^1.5 log n) ⊂ O(n^1.7095).
EDIT: You can write O(n^1.5 log n) as O(n ⋅ sqrt(n) ⋅ log n). As @Teepeemm showed, the exact complexity is Θ(n log²(n)), which improves the upper bound I showed by replacing the factor sqrt(n) with log(n).