Question

Given f(x, y) and g(n):

def f(x, y):
    if x < 1 or y < 1:
        return 1
    return f(x - 1, y - 1) + f(x - 1, y - 1)

def g(n):
    return f(n, n)

what is the Big Theta bound of g(n)?

I reasoned that since x == y, the conditional in f(x, y) stays false until both arguments reach 0, so the 2 recursive calls will determine the complexity.

Considering only one of the calls to f(x - 1, y - 1): it takes n recursive steps to reach the base case, and each step branches into two more calls. At this point I don't know how to proceed.

(The answer is Θ(2^n).)

Solution

One way to solve this problem is to write a recurrence relation for it. If you'll notice, the two arguments to f are always equal to one another: they start equal in the call g(n) → f(n, n), and each recursive call decrements both by 1, so they stay equal. Therefore, we can write a recurrence relation T(n) that gives the runtime of f(n, n).

So what would T(n) be? Well, as a base case, T(0) would be 1, since the function does a constant amount of work as soon as n drops to 0. Otherwise, the function does a constant amount of work and then makes two recursive calls to problems of size n - 1. Therefore, we get this recurrence:

T(0) = 1

T(n+1) = 2T(n) + 1
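To see where the pattern below comes from, a quick sketch (in Python, to match the original code) evaluates the recurrence directly:

```python
def T(n):
    """Evaluate T(0) = 1, T(n+1) = 2*T(n) + 1 literally."""
    if n == 0:
        return 1
    return 2 * T(n - 1) + 1

print([T(n) for n in range(5)])  # → [1, 3, 7, 15, 31]
```

Each term is one more than double the previous, which is what produces the 2^(n+1) - 1 shape.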

Looking at the terms of the recurrence, we see a pattern:

  • T(0) = 1
  • T(1) = 3
  • T(2) = 7
  • T(3) = 15
  • T(4) = 31
  • ...
  • T(n) = 2^(n+1) - 1

You can formally prove this using induction if you'd like. Since T(n) is given by 2^(n+1) - 1, the runtime is Θ(2^n).

Hope this helps!

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow