int f(int x)
{
  if (x < 1) return 1;

  return f(x-1) + g(x);
}


int g(int x)
{
  if (x < 2) return 1;

  return f(x-1) + g(x/2);
}

What is the big-O of f? More importantly, what technique is used to calculate the runtime for problems like this?


Solution

Let's write Cf(x) (resp. Cg(x)) for the number of additions performed when calling f(x) (resp. g(x)).

First of all, both functions return a value that is built up purely by adding 1s together; each addition merges two partial sums into one, so the number of additions is exactly one less than the returned value. Therefore

Cf(x) = f(x) - 1
Cg(x) = g(x) - 1
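
As a quick sanity check of that relation, here is a minimal Python sketch (my own transcription of the two C functions, with x // 2 standing in for the C integer division, plus a global counter adds added purely to count the additions):

adds = 0  # number of additions performed so far

def f(x):
    global adds
    if x < 1:
        return 1
    adds += 1                    # the one addition done by this call to f
    return f(x - 1) + g(x)

def g(x):
    global adds
    if x < 2:
        return 1
    adds += 1                    # the one addition done by this call to g
    return f(x - 1) + g(x // 2)  # x // 2 mirrors the C integer division

value = f(8)
print(value, adds)               # prints 474 473, i.e. Cf(8) == f(8) - 1

The same definitions also reproduce the value table below.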

So let's stick to f and g. Here are the first few values:

[(f(i), g(i), 2**i) for i in range(10)]
[(1, 1, 1),
 (2, 1, 2),
 (5, 3, 4),
 (11, 6, 8),
 (25, 14, 16),
 (53, 28, 32),
 (112, 59, 64),
 (230, 118, 128),
 (474, 244, 256),
 (962, 488, 512)]

Looks exponential. Moreover:

f(x) = f(x-1) + g(x)
     = f(x-1) + (f(x-1) + g(x/2))
     = 2*f(x-1) + g(x/2)

This clearly indicates that

f(x) >= 2*f(x-1) >= 4*f(x-2) >= 8*f(x-3) >= ... >= 2^x * f(0) = 2^x.

So it is a good bet that f(x) is O(2^x), and actually Theta(2^x).

Now f(x) >= 2^x, and the two recurrences give f(x-1) <= g(x) <= f(x), so g and f grow at the same rate. As a consequence, g(x/2) is at most f(x/2), i.e. on the order of 2^(x/2), and is therefore completely negligible compared to f(x). So

f(x) is O(2^x), indeed Theta(2^x).
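
If you want to confirm all of this numerically, here is a small sketch (memoized with functools.lru_cache purely so that large arguments stay cheap to evaluate; we only care about the values here, not about how many calls the naive version would make):

from functools import lru_cache

@lru_cache(maxsize=None)
def f(x):
    return 1 if x < 1 else f(x - 1) + g(x)

@lru_cache(maxsize=None)
def g(x):
    return 1 if x < 2 else f(x - 1) + g(x // 2)

# f(x-1) <= g(x) <= f(x): f and g grow at the same rate
assert all(f(x - 1) <= g(x) <= f(x) for x in range(2, 200))

for x in (10, 20, 30, 40, 50):
    # f(x) / 2^x stabilizes around a constant (roughly 1.96), as Theta(2^x)
    # predicts, while g(x // 2) / f(x) shrinks towards 0 (g(x/2) is negligible)
    print(x, f(x) / 2 ** x, f"{g(x // 2) / f(x):.2e}")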