Question

I am analyzing this block of code to work out its worst-case theoretical running time using the Master Theorem. Could someone give me a step-by-step solution showing how to arrive at the running time?

def sort(list, i, j):
    # Put the endpoints in order.
    if list[i] > list[j]:
        list[j], list[i] = list[i], list[j]
    # Base case: two or fewer elements.
    if i + 1 >= j:
        return
    # One third of the range; integer division keeps the indices integral.
    k = (j - i + 1) // 3
    sort(list, i, j - k)   # first two thirds
    sort(list, i + k, j)   # last two thirds
    sort(list, i, j - k)   # first two thirds again
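
For reference, a minimal invocation, assuming a plain Python list (the sample data is just an illustration; i and j are inclusive indices):

data = [5, 2, 9, 1, 7]
sort(data, 0, len(data) - 1)
print(data)   # [1, 2, 5, 7, 9]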

Solution

The Master Theorem applies to recurrences of the form T(n) = a*T(n/b) + f(n), where a is the number of subproblems, n/b is the size of each subproblem, and f(n) is the amount of work you do in each call outside the recursion.

a: There are three subproblems, because the function makes three recursive calls.

n/b: Each subproblem is fed at most 2/3 of the original input. Let n = j - i + 1 be the number of elements in the range. To take one call as an example: sort(list, i, j - k) covers indices i through j - k, which is (j - k) - i + 1 = n - k ≈ n - n/3 = 2n/3 elements. The other two calls have the same size, so b = 3/2.
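
As a quick sanity check, here is a small sketch (the helper subproblem_sizes is mine, not part of the original code) that computes the three subproblem sizes for a sample range:

def subproblem_sizes(i, j):
    # Sizes of the three recursive calls made on the range [i, j].
    n = j - i + 1
    k = n // 3
    return [(j - k) - i + 1,   # sort(list, i, j - k)
            j - (i + k) + 1,   # sort(list, i + k, j)
            (j - k) - i + 1]   # sort(list, i, j - k) again

print(subproblem_sizes(0, 8))    # 9 elements  -> [6, 6, 6], i.e. 2n/3 each
print(subproblem_sizes(0, 11))   # 12 elements -> [8, 8, 8]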

f(n): is O(1), i.e. bounded by a constant, since outside the recursive calls each invocation only does one comparison and at most one swap.

Putting it together: T(n) = 3*T(2n/3) + O(1), so a = 3 and b = 3/2. Since f(n) = O(1) = O(n^0) and log(3) base (3/2) ≈ 2.7095 > 0, case 1 of the Master Theorem applies, and the running time is O(n^(log(3) base (3/2))), which is roughly O(n^2.71).
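
To double-check that exponent numerically, here is a sketch of my own (not from the original post) that computes it with math.log and counts how many calls the recursion actually makes on a range of n elements:

import math
from functools import lru_cache

@lru_cache(maxsize=None)
def calls(n):
    # Total sort() invocations on a range of n elements: one call plus
    # three recursive calls on ranges of n - n//3 elements each.
    if n <= 2:
        return 1
    return 1 + 3 * calls(n - n // 3)

exponent = math.log(3, 3 / 2)   # log(3) base (3/2) ~= 2.7095
print(round(exponent, 4))
for n in (10**3, 10**6, 10**9):
    # Measured exponent: log base n of the call count.
    print(n, calls(n), round(math.log(calls(n), n), 3))

The measured exponent sits a bit above 2.71 for moderate n because of lower-order terms, but it sinks toward log(3) base (3/2) as n grows.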

Licensed under: CC-BY-SA with attribution