Question

Consider this implementation of Euclid's algorithm:

function gcd(a, b)
    while b ≠ 0
        t := b
        b := a mod b
        a := t
    return a
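
In Python terms (just to make the pseudocode concrete), the loop is:

    def gcd(a, b):
        # Repeatedly replace (a, b) with (b, a mod b) until b reaches 0.
        while b != 0:
            a, b = b, a % b
        return a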

A nice proof on Wikipedia shows that the algorithm needs O(h) divisions, where h is the number of digits of the smaller number b.

However, on a Turing Machine a procedure that computes a mod b has time complexity O(a + b). My intuition and some large tests tell me that the overall complexity of Euclid's algorithm is still O(a + b) on a Turing Machine.
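
The hand-wavy version of that intuition: writing (a_i, b_i) for the operands at step i (with a >= b), each remainder satisfies a_{i+2} = a_i mod b_i < a_i / 2, so the operands at least halve every second step and the per-step costs should sum like a geometric series, something like

    sum over i of (a_i + b_i)  <=  2 * sum over i of a_i  <  2 * (a + b) * (1 + 1/2 + 1/4 + ...)  =  4 * (a + b)  =  O(a + b)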

Any thoughts on how to prove that?


Solution

If I were implementing GCD on binary numbers on a Turing machine, I'd use the following algorithm; I can't see how it would come out smaller than O((ln a + ln b)^2), though. The most efficient tape representation, I think, would be to interleave the bits of the two values after step 2. A rough Python sketch of the same steps follows the list.

  1. Let s1 be the number of trailing zero bits of a (zeros in its least significant bits). Delete those s1 bits.
  2. Let s2 be the number of trailing zero bits of b. Delete those s2 bits.
  3. Let s = min(s1, s2).
  4. Now a and b are both odd. If b < a, swap a and b.
  5. Now b >= a. Set b = b - a (odd minus odd, so the result is even), then delete all the trailing zero bits from b.
  6. If b != 0, go to step 4.
  7. Append s zero bits to the least significant end of a (i.e. multiply a by 2^s). This is the result.
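
As a sanity check, here is a rough Python model of those steps (Python's arbitrary-precision integers stand in for the tape; it assumes a > 0 and b > 0):

    def binary_gcd(a, b):
        s1 = (a & -a).bit_length() - 1        # step 1: count trailing zero bits of a
        a >>= s1                              # ... and drop them
        s2 = (b & -b).bit_length() - 1        # step 2: same for b
        b >>= s2
        s = min(s1, s2)                       # step 3: shared power of two
        while True:
            if b < a:                         # step 4: a and b are odd; keep b >= a
                a, b = b, a
            b -= a                            # step 5: odd - odd, so b is now even
            if b == 0:                        # step 6: done when the difference hits 0
                break
            b >>= (b & -b).bit_length() - 1   # strip the new trailing zero bits
        return a << s                         # step 7: put the 2^s factor back
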
Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow