Question

As far as I can tell the logic in this makes sense, yet the output is incorrect and I can't seem to make sense of it.

#include <stdio.h>

int gcd(int, int);

int main()
{
    int n, m;

    printf("enter two numbers");
    scanf("%d%d", &n, &m);

    printf("The gcd of %d and %d is: %d \n", n, m, gcd(n,m));

    return 0;
}

int gcd(int x, int y)
{
    while(x!=y){
            if(x>y)
                    return (x-y,y);
            else
                    return(x,y-x);
    }
    return x;
}

Solution

That's not a recursive implementation of gcd: the function never calls itself, and it uses a while loop besides. A truly recursive approach looks like this:

int gcd(int x, int y) {
    if (y == 0)
        return x;
    else
        return gcd(y, x % y);
}

Or a bit shorter:

int gcd(int x, int y) {
    return y == 0 ? x : gcd(y, x % y);
}

The above implementation is based on the Euclidean algorithm; see its description for more details.
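For example, gcd(48, 18) unfolds as gcd(48, 18) -> gcd(18, 12) -> gcd(12, 6) -> gcd(6, 0) and returns 6, since 48 % 18 = 12, 18 % 12 = 6, and 12 % 6 = 0.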

OTHER TIPS

return (x-y,y);

will simply return y: the comma operator evaluates x-y, discards the result, and yields its right operand. That code compiles, but it probably does not do what you expected.
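
To see the comma operator in isolation, here is a minimal standalone sketch (not taken from the original code):

#include <stdio.h>

int main(void)
{
    int x = 10, y = 4;
    /* The comma operator evaluates x - y, discards the result, and the
       whole parenthesized expression takes the value of y.  Compilers
       may warn that the left operand has no effect. */
    int r = (x - y, y);
    printf("%d\n", r);   /* prints 4, not 6 */
    return 0;
}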

A non-recursive Euclid GCD algorithm would look like this:

int gcd (int x, int y)
{
    while (y != 0)
    {
        int r = x % y;
        x = y;
        y = r;
    }
    return x;
}

Compared with the recursive version:

int gcd (int x, int y)
{
    return y == 0 ? x : gcd (y, x%y);
}

As is often the case with such trivial examples, the recursive approach uses less source code but is less efficient.

For each recursive call, copies of x and y plus a return address are pushed onto the stack, while the iterative version simply works with an intermediate variable r for the remainder of x / y.

Even if some compilers are smart enough to detect the tail call and optimize the recursion away, it is useful to understand the inherent cost of recursive coding.
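
As a rough illustration (hand-written, not actual compiler output), such a tail-call optimization effectively turns the recursive version into a loop like this:

int gcd(int x, int y)
{
    for (;;) {
        if (y == 0)
            return x;
        int r = x % y;   /* the "recursive call" becomes a reassignment of the parameters */
        x = y;
        y = r;
    }
}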

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow