The Master theorem applies to recurrences of the form `T(n) = a*T(n/b) + f(n)`, where `a`
is the number of subproblems, `n/b`
is the size of each subproblem, and `f(n)`
is the amount of work you do in each call outside the recursion.
`a`:
There are three subproblems, i.e. three recursive calls, so `a = 3`.
`n/b`:
Each subproblem is fed at most 2/3
of the original range, so `b = 3/2`. You can verify this for each call, but to take one example: `sort(list, i, j-k)`
with `k = (j-i+1)/3` has problem size `(j-k) - i`,
which is `j - (j-i+1)/3 - i = (2j - 2i - 1)/3`, i.e. roughly 2/3 of the original size `j - i`.
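For context, this recurrence matches a stooge-sort-style routine. Here is a minimal Python sketch (my reconstruction, assuming `k = (j-i+1)/3` as in the derivation above) where you can see the three recursive calls, each on a range of about 2/3 the size:

```python
def stooge_sort(lst, i, j):
    # Constant-time work per call: one comparison and at most one swap
    if lst[j] < lst[i]:
        lst[i], lst[j] = lst[j], lst[i]
    if j - i + 1 > 2:
        k = (j - i + 1) // 3
        stooge_sort(lst, i, j - k)      # first 2/3 of the range
        stooge_sort(lst, i + k, j)      # last 2/3 of the range
        stooge_sort(lst, i, j - k)      # first 2/3 again -> a = 3 calls total
```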
`f(n)`:
is `O(1)`,
i.e. bounded by constant time, since each call just does two constant-time swaps.
Since `f(n) = O(1)` grows polynomially slower than `n^(log_b a)`, the first case of the Master theorem applies, and this comes out to `O(n^(log base 3/2 of 3))`. Note that the base is `b = 3/2`, not `2/3`: each subproblem is 2/3 the size, so `n/b = 2n/3` means `b = 3/2`.
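If you want that exponent as a number, you can evaluate the logarithm directly (Python here purely to show the arithmetic):

```python
import math

# Exponent from the Master theorem: log base b of a, with a = 3, b = 3/2
exponent = math.log(3, 3 / 2)
print(round(exponent, 4))  # 2.7095
```

So the running time is roughly `O(n^2.71)`.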
Sorry about the math notation; I'm not sure how to typeset it on Stack Overflow.