Question

Whenever I see a recursive solution, or write recursive code for a problem myself, it is really difficult for me to figure out the time complexity; in most cases I just say it is exponential. How is it actually exponential? How do people tell whether it is 2^n, n!, n^n or n^k?

I have some questions in mind -

  1. Let's say we find all permutations of a string (O(n!)).
  2. Find all sequences which sum up to k in an array (exponential, but how exactly do I calculate it?).
  3. Find all subsets of size k whose sum is 0 (will k appear somewhere in the complexity? It should, right?).

Can anyone help me with how to calculate the exact complexity of such problems? I am able to write code for them, but it's hard to understand the exact time complexity.


Solution

Find all sequences which sum up to k in an array (exponential, but how exactly do I calculate it?).

Let F(a, n, k) be the number of subsets S ⊂ {0, 1, ..., n-1} such that

∑_{i ∈ S} a[i] == k

Then we can compute F(array, length of array, k) recursively by splitting the problem into two subproblems (for n > 0).

The subsets of {0, 1, ..., n-1} can be partitioned into two classes, those that contain n-1 and those that don't.

We obtain the recursion

F(a,n,k) = F(a,n-1,k) + F(a,n-1, k-a[n-1])

Let T(n) be the time necessary to compute F(_, n, _) (the underscores indicating that T(n) depends only on n, not on the array or on k, although for specific arrays and k faster algorithms are possible). The recursion for F then immediately implies

T(n) = 2 * T(n-1)

for n > 0.

For n == 0, we can compute the solution in constant time,

F(a, 0, k) = k == 0 ? 1 : 0

so inductively we have T(n) = 2^n * T(0), i.e. the counting takes Θ(2^n) time.
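For concreteness, here is a direct transcription of the recursion and base case into code (a minimal Python sketch; the function name count_subsets is my own, not part of the original answer):

def count_subsets(a, n, k):
    # Base case: only the empty set is left; it sums to 0.
    if n == 0:
        return 1 if k == 0 else 0
    # Either leave out a[n-1], or include it and look for the remaining sum.
    return count_subsets(a, n - 1, k) + count_subsets(a, n - 1, k - a[n - 1])

The two recursive calls are exactly where the factor 2 in T(n) = 2 * T(n-1) comes from; for example, count_subsets([1, 2, 3], 3, 3) returns 2, for the subsets {1, 2} and {3}.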

If the subsets shall not only be counted but also output, the complexity becomes O(n * 2^n), and that bound is tight (for an array of all 0s, with k == 0, all subsets meet the condition, and printing them takes Θ(n * 2^n) time).
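A sketch of that output variant could look like this (again Python; the helper name print_subsets and the extra parameter chosen, which records the indices picked so far, are assumptions of mine):

def print_subsets(a, n, k, chosen):
    if n == 0:
        if k == 0:
            print(chosen)   # emitting one subset costs up to O(n)
        return
    print_subsets(a, n - 1, k, chosen)                       # skip a[n-1]
    print_subsets(a, n - 1, k - a[n - 1], chosen + [n - 1])  # take a[n-1]

Calling print_subsets([0, 0, 0], 3, 0, []) prints all 2^3 = 8 index sets, which is exactly the all-zero worst case mentioned above.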

Find all subsets of size k whose sum is 0 (will k appear somewhere in the complexity? It should, right?).

Yes, the complexity of that problem depends on n and k.

Let F(a,n,k,s) be the number of subsets S ⊂ {0, 1, ..., n-1} of cardinality k such that

∑_{i ∈ S} a[i] == s

For k == 0, we again have a constant-time answer: there is one such subset (the empty set) if s == 0, and none otherwise. For k > n, the set {0, 1, ..., n-1} has no subsets of cardinality k, so F(a,n,k,s) = 0 if k > n.

If 0 < k <= n, we can, like above, consider the subsets containing n-1 and those that don't separately, giving

F(a,n,k,s) = F(a,n-1,k,s) + F(a,n-1,k-1,s - a[n-1])

and for the time complexity we find

T(n,k) = T(n-1,k) + T(n-1,k-1)

That recursion is known from the binomial coefficients, and we have

T(n,k) = n `choose` k = n! / (k! * (n-k)!)

(with T(n,0) = 1).
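A sketch of the corresponding counting function (Python again; the name count_subsets_of_size is my own):

def count_subsets_of_size(a, n, k, s):
    if k == 0:                 # only the empty set has cardinality 0
        return 1 if s == 0 else 0
    if k > n:                  # fewer than k elements remain
        return 0
    return (count_subsets_of_size(a, n - 1, k, s)
            + count_subsets_of_size(a, n - 1, k - 1, s - a[n - 1]))

The two calls shrink either only n or both n and k, which is exactly Pascal's rule and the reason the call count follows the binomial coefficient; e.g. count_subsets_of_size([1, 2, 3, 4], 4, 2, 5) returns 2, for {1, 4} and {2, 3}.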

Once again, if the sets shall not only be counted but also output, the time complexity increases; here all sets have cardinality k, so it becomes

k * n! / (k! * (n-k)!)
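As a rough sense of scale: for n = 20 and k = 10 there are 20 choose 10 = 184756 index sets of size 10, so printing them all (in the worst case, e.g. an all-zero array with s == 0) writes about 10 * 184756 ≈ 1.8 million elements.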

OTHER TIPS

Take a look at the Master Theorem. It is explained very well in Prof. Tim Roughgarden's Algorithms: Design and Analysis, Part I from Stanford. These are online classes, which I recommend, but you can watch the videos without enrolling in the course. There is also Part II of this course, if you are interested.
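For reference, one common statement of the theorem covers recursions with a subproblems of size n/b and O(n^d) work outside the recursive calls:

T(n) <= a * T(n/b) + O(n^d),   with a >= 1, b > 1, d >= 0

T(n) = O(n^d * log n)    if a == b^d
T(n) = O(n^d)            if a <  b^d
T(n) = O(n^(log_b a))    if a >  b^d

Note that the recursions in the answer above shrink n by 1 rather than dividing it by a constant factor, so they fall outside this form and are solved by unrolling instead.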
