Question

def f(L):
    if len(L) < 1_000_000_000:  # "1 billion", written as a number so the code actually runs
        return L
    else:
        return f(L[:len(L) // 2]) + f(L[len(L) // 2:])

L is a list of size n

I know that if there were only a single recursive call, it would be O(log n), but there are two recursive calls here.

But it started to exhibit more of an O(n) runtime when I ran it through a visualizer.

In my opinion it should be O(log n + log n) = O(2 log n) = O(log n). Am I correct?


Solution

Consider how many calls you're making. At the first level of the recursion you make 2 calls. Each of those makes two more calls, and so on. This means that at level i of the recursion you will have made a total of O(2^i) function calls.

How many levels of the recursion are there? This is just the height of a binary tree with n elements, which is O(log_2 n).

So by the time you reach all the leaves of the recursion you will have done O(2^(log_2 n)) = O(n) function calls.
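
If you want to see that call count empirically, you can instrument the function with a counter. The sketch below is not from the original post; it uses a much smaller threshold (4) so the recursion actually fires on small inputs:

call_count = 0

def f_counted(L, threshold=4):
    # Same shape as the question's f, but with a call counter and a tiny threshold for demonstration.
    global call_count
    call_count += 1
    if len(L) < threshold:
        return L
    else:
        return f_counted(L[:len(L) // 2], threshold) + f_counted(L[len(L) // 2:], threshold)

for n in (16, 32, 64, 128):
    call_count = 0
    f_counted(list(range(n)))
    print(n, call_count)  # the call count roughly doubles as n doubles, i.e. O(n) calls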

--

Another way of looking at it is that you eventually have to piece the entire list back together, so how could you possibly do that in less than O(n) time?

OTHER TIPS

Your algorithm as it stands will take at least O(n) time whenever len(L) is 1 billion or more, because you break the list into two halves and then add them back together. Both slicing and list concatenation are O(n) operations.
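
As a rough illustration (my own sketch, not part of the original answer), you can time a single split-and-concatenate step and watch it scale linearly with the list size:

import time

def split_and_join(L):
    # The non-recursive work done in one call: slice into two halves, then concatenate them.
    mid = len(L) // 2
    return L[:mid] + L[mid:]

for n in (1_000_000, 2_000_000, 4_000_000):
    L = list(range(n))
    start = time.perf_counter()
    split_and_join(L)
    elapsed = time.perf_counter() - start
    print(n, f"{elapsed:.4f}s")  # the time roughly doubles as n doubles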

If you want to test the runtime of the two recursive calls on their own (a sketch follows below),

1. Pass in a start and end index, and call

f(L, start, start+(end-start)//2) + f(L, start+(end-start)//2, end)

2. Return end-start or some other O(1) value when end-start is less than 1 billion
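
Putting those two tips together might look something like the sketch below. This is my own illustration, not code from the answer, and it uses a much smaller threshold (1,000) so the behaviour is visible on ordinary inputs:

def f(L, start, end, threshold=1_000):
    # Operate on the index range [start, end) instead of slicing, so no O(n) copies are made.
    if end - start < threshold:
        return end - start  # some O(1) stand-in for the base case
    mid = start + (end - start) // 2
    return f(L, start, mid, threshold) + f(L, mid, end, threshold)

L = list(range(100_000))
print(f(L, 0, len(L)))  # with only O(1) work per call, total time tracks the number of calls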

Licensed under: CC-BY-SA with attribution