Question

Here is my code below. I suck at analyzing recursive functions; I think it's O(2^n). What do you guys think? (I know you can write it better; my question is about the run time OF THE CODE BELOW.)

static int ans = 0;

public static void ablehelper(int a, int b, int c, int d) {
    if (a != c && (b + a) > c) {
        return;
    }
    if (b != d && (b + a) > d) {
        return;
    }
    if (a == c && b == d) {
        ans = 1;
        return;
    }
    ablehelper(a + b, b, c, d);
    ablehelper(a, b + a, c, d);
}

public static String able(int a, int b, int c, int d) {
    ablehelper(a, b, c, d);
    if (ans == 1) {
        return "Able to generate";
    } else {
        return "Not able to generate";
    }
}

Solution

First, note that the variable n has no defined meaning in the context of this problem, so it makes no sense to express the runtime of your algorithm as a function of n.

Secondly, note that your algorithm only has reasonable terminating behavior if

  • a and b are positive OR
  • even for the initial a, b we have a + b > min{c, d}

The latter case is not particularly interesting, so let's concentrate on the former; in particular, we then have a, b >= 1.

The pair (c, d) stays fixed throughout the recursion, so let's look at the pairs (a, b) that your algorithm enumerates. Let A and B be the initial values of a and b. In every function call we then have (a, b) = (x1 * A + y1 * B, x2 * A + y2 * B) for some non-negative integer coefficients (x1, y1, x2, y2), starting from (1, 0, 0, 1). The two recursive calls transform this pair into

((x1 + x2) * A + (y1 + y2) * B, x2 * A + y2 * B)
(x1 * A + y1 * B, (x1 + x2) * A + (y1 + y2) * B)

We can show that the function f(x1, y1, x2, y2) = (x1 + x2, y1 + y2) is injective when restricted to the values (x1, y1, x2, y2) that occur during the execution of the algorithm.
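
As a quick sanity check (this snippet is not part of the original post, it just illustrates the claim), we can walk the recursion's coefficient tuples starting from (1, 0, 0, 1), ignoring the prunings (which only remove values), and verify that f never produces the same pair twice:

import java.util.HashSet;
import java.util.Set;

public class InjectivityCheck {
    // Records f(x1, y1, x2, y2) = (x1 + x2, y1 + y2) over the unpruned
    // recursion tree, starting from the coefficients (1, 0, 0, 1),
    // which correspond to (a, b) = (A, B).
    static Set<String> seen = new HashSet<>();
    static boolean duplicate = false;

    static void explore(long x1, long y1, long x2, long y2, int depth) {
        if (depth == 0) return;
        if (!seen.add((x1 + x2) + "," + (y1 + y2))) duplicate = true;
        // ablehelper(a + b, b, c, d): (x1, y1) becomes (x1 + x2, y1 + y2)
        explore(x1 + x2, y1 + y2, x2, y2, depth - 1);
        // ablehelper(a, b + a, c, d): (x2, y2) becomes (x1 + x2, y1 + y2)
        explore(x1, y1, x1 + x2, y1 + y2, depth - 1);
    }

    public static void main(String[] args) {
        explore(1, 0, 0, 1, 18); // 2^18 - 1 tuples, runs in well under a second
        System.out.println(duplicate ? "f repeated a value" : "all f values distinct");
    }
}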

If we look at the complete recursion tree, without the prunings, f takes on the following values:

(1,1)
(1,2)
(1,3)
...
(2,1)
(2,3)
(2,5)
(2,7)
...
(3,1)
(3,2)
(3,4)
(3,5)
(3,7)
...

Notice the pattern? We can show that f takes on exactly the value pairs (x, y) with gcd(x, y) = 1 (the coprime pairs). Furthermore, because of the pruning, we only enumerate those pairs (x, y) with Ax + By <= min{c,d}. The runtime T of your algorithm, expressed as a function of A, B, c and d, is therefore

T(A, B, c, d) = Θ( #{ (x, y) : x, y >= 1, gcd(x, y) = 1, A*x + B*y <= min{c,d} } )
             = Θ( sum over x >= 1 of #{ y : 1 <= y <= floor((min{c,d} - A*x) / B), gcd(x, y) = 1 } )

At this point I'm not sure how to establish the asymptotics of this function precisely. A quick-and-dirty upper bound, obtained by dropping the co-primeness constraint and the floors, is T(A, B, c, d) = O(min{c,d}^2 / (A * B)) = O(min{c,d}^2). I'm not sure how tight this bound is.
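
For what it's worth, the bound can also be probed empirically. The following snippet (not from the original post) counts how many calls the pruned recursion actually makes and prints the count next to the min{c,d}^2 / (A * B) estimate:

public class RuntimeProbe {
    // Same recursion as in the question, instrumented with a call counter
    // (the ans flag is irrelevant for counting, so it is omitted here).
    static long calls = 0;

    static void ablehelper(int a, int b, int c, int d) {
        calls++;
        if (a != c && (b + a) > c) return;
        if (b != d && (b + a) > d) return;
        if (a == c && b == d) return;
        ablehelper(a + b, b, c, d);
        ablehelper(a, b + a, c, d);
    }

    public static void main(String[] args) {
        int A = 1, B = 1, c = 1000, d = 1000;
        ablehelper(A, B, c, d);
        long bound = (long) Math.min(c, d) * Math.min(c, d) / ((long) A * B);
        System.out.println("calls = " + calls + ", min{c,d}^2 / (A*B) = " + bound);
    }
}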

UPDATE: In your CR post, you propose a much better algorithm to solve the problem. It still contains a bug, but it's a good start:

public static String betterSolution(int a, int b, int c, int d) {
    while (c > a || d > b) {
        if (c > d) {
            if (d == 0) break; // otherwise, it does not terminate!
            c = c - d;
        } else {
            if (c == 0) break; // otherwise, it does not terminate!
            d = d - c;
        }
    }
    if (c == a && d == b) {
        return "Able to generate";
    } else {
        return "Not able to generate";
    }
}

This algorithm is still only polynomial in the input values (not in their bit length): O(max{c,d}) iterations in the worst case. You can improve it to logarithmic time by "skipping" over steps:

public static String evenBetterSolution(int a, int b, int c, int d) {
    while (c > a || d > b) {
        if (c > d) {
            if (d == 0) break;
            c = c - Math.max(1, (c - a) / d) * d;
        } else {
            if (c == 0) break;
            d = d - Math.max(1, (d - b) / c) * c;
        }
    }
    if (c == a && d == b) {
        return "Able to generate";
    } else {
        return "Not able to generate";
    }
}

The algorithm is based on the Euclidean GCD algorithm and the runtime is bounded by O(log(min{c,d})). Very good.
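
If you want to convince yourself that the step-skipping does not change the result, a small driver like the one below (not part of the original answer; it assumes betterSolution and evenBetterSolution are static methods of the same class) compares the two versions on random inputs:

// Drop this main into the class that contains betterSolution and
// evenBetterSolution; it reports any input on which the two disagree.
public static void main(String[] args) {
    java.util.Random rnd = new java.util.Random(42);
    for (int i = 0; i < 100000; i++) {
        int a = 1 + rnd.nextInt(50), b = 1 + rnd.nextInt(50);
        int c = 1 + rnd.nextInt(50), d = 1 + rnd.nextInt(50);
        if (!betterSolution(a, b, c, d).equals(evenBetterSolution(a, b, c, d))) {
            System.out.println("mismatch at a=" + a + " b=" + b + " c=" + c + " d=" + d);
        }
    }
    System.out.println("check finished");
}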

OTHER TIPS

This reminds me of a recent TopCoder problem, so I will assume the missing piece of the statement comes from there and answer the following question:

Let a, b, c and d be integers from 1 to n, inclusive. What is the time complexity of the coded algorithm with respect to n?

I'd say it's O(n^2). Note that c and d remain constant while a and b grow. If (in the worst case) a = b = 1 and c = d = n, we will visit each pair (a, b) with a + b <= n no more than once. To see why, consider a pair (a, b) and look back at how it could have been generated. A pair (a, b) with a > b can only have been generated from (a - b, b), and a pair with a < b can only have been generated from (a, b - a). Otherwise, one of the values would have been negative, and since we start with positive a and b, they remain positive during the whole process.

More precisely, when we start with a = b = 1, we visit each pair (a, b) with a + b <= n and gcd(a, b) = 1 exactly once, and we don't visit any other pair. The number of such pairs is ~3n^2/pi^2, which is Θ(n^2).
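
If you want to see that constant in action, here is a tiny counting experiment (not part of the original answer) that compares the exact number of such pairs to the estimate:

public class CoprimeDensity {
    static int gcd(int x, int y) { return y == 0 ? x : gcd(y, x % y); }

    public static void main(String[] args) {
        int n = 2000;
        long count = 0;
        // Count ordered pairs (a, b) with a, b >= 1, a + b <= n and gcd(a, b) = 1.
        for (int a = 1; a < n; a++) {
            for (int b = 1; a + b <= n; b++) {
                if (gcd(a, b) == 1) count++;
            }
        }
        double estimate = 3.0 * n * n / (Math.PI * Math.PI);
        System.out.println("exact = " + count + ", 3n^2/pi^2 ~ " + Math.round(estimate));
    }
}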

Let's say:

a = -1, b = 0, c = 0, d = 0

These values skip the first if, the second if, and the third if, and go straight to

ablehelper(a + b, b, c, d);

Since -1 + 0 is -1, we start over from the beginning with exactly the same arguments, and this never ends (except with a StackOverflowError).
Regarding the complexity, as Marco Acierno pointed out, it doesn't make much sense.
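
If you want to guard against such inputs, one option (just a sketch, not part of the original code) is to validate the arguments up front and, while at it, reset the shared flag:

public static String able(int a, int b, int c, int d) {
    // Hypothetical guard: the recursion only terminates reliably for positive
    // a and b, so reject anything else instead of recursing forever.
    if (a <= 0 || b <= 0) {
        throw new IllegalArgumentException("a and b must be positive");
    }
    ans = 0; // reset the static flag so able() can be called more than once
    ablehelper(a, b, c, d);
    return ans == 1 ? "Able to generate" : "Not able to generate";
}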

Licensed under: CC-BY-SA with attribution