Question

As I understand the master theorem, an algorithm's running time can be defined recursively as:

T(n) = a T(n/b) + O(n^d)

where a is the number of subproblems, n/b is the size of each subproblem, and O(n^d) is the time to recombine the subproblems' results. The master theorem then gives the time complexity as follows:

T(n) =  { O(n^d)                    if d > log base b of a
        {
        { O(n^d log n)              if d = log base b of a
        {
        { O(n^ (log base b of a) )  if d < log base b of a
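
As a concrete illustration (this sketch is my own addition; the function name master_theorem_bound and the merge-sort numbers are just an example), a small Python snippet that classifies a recurrence of this form by comparing d with log base b of a:

    from math import log

    def master_theorem_bound(a, b, d):
        """Classify T(n) = a*T(n/b) + O(n^d) by comparing d with log base b of a."""
        critical = log(a, b)               # log base b of a
        if d > critical:
            return f"O(n^{d})"
        elif d == critical:                # in practice, compare with a small tolerance
            return f"O(n^{d} log n)"
        else:
            return f"O(n^{critical})"

    # Merge sort: 2 subproblems of size n/2, O(n) merge step -> a=2, b=2, d=1
    print(master_theorem_bound(2, 2, 1))   # prints O(n^1 log n), i.e. O(n log n)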

My question is: what if the recombination time is not of the form O(n^d), such as O(2^n) or O(log n)? How would one determine the time complexity then?


Solution

With the Master theorem you wouldn't; it only applies to recurrences of a certain form. You say:

As I understand the master theorem, an algorithm's running time can be defined recursively as:

But this is not entirely accurate: not every algorithm's running time can be defined recursively like that, as you have shown yourself. The Master theorem only applies to recurrences of that specific form (the statement of the theorem spells out exactly which cases it covers).

For recurrences of other forms there are other results, such as the Akra–Bazzi method, which handles a much broader class of divide-and-conquer recurrences.
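
As an illustration (this sketch is my own addition, not part of the original answer; the function names and the use of SciPy are assumptions), here is how one could numerically evaluate the Akra–Bazzi bound T(n) = Theta(n^p (1 + integral from 1 to n of g(u)/u^(p+1) du)), where p solves sum(a_i * b_i^p) = 1:

    from math import log
    from scipy.optimize import brentq      # assumes SciPy is available
    from scipy.integrate import quad

    def akra_bazzi_exponent(terms):
        """Solve sum(a_i * b_i**p) = 1 for p, given terms = [(a_i, b_i), ...]."""
        f = lambda p: sum(a * b ** p for a, b in terms) - 1.0
        return brentq(f, -50, 50)           # f is strictly decreasing since each b_i < 1

    def akra_bazzi_estimate(terms, g, n):
        """Evaluate n**p * (1 + integral from 1 to n of g(u)/u**(p+1) du)."""
        p = akra_bazzi_exponent(terms)
        integral, _ = quad(lambda u: g(u) / u ** (p + 1), 1, n)
        return p, n ** p * (1 + integral)

    # Example tied to the question: T(n) = 2*T(n/2) + O(log n),
    # a recombination cost that is not of the form n^d.
    # Akra-Bazzi gives p = 1 and a bounded integral, so T(n) = Theta(n).
    for n in (1e4, 1e6):
        p, est = akra_bazzi_estimate([(2, 0.5)], log, n)
        print(p, est / n)                   # est/n settles near a constant (about 2)

The Akra–Bazzi result also covers unequal subproblem sizes (different b_i), which the master theorem cannot handle.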

Licensed under: CC-BY-SA with attribution