Question

I have been finding it difficult to work out the time complexity of the binary GCD algorithm, also known as Stein's algorithm, which is stated to be O(n^2), where n is the number of bits in the larger of the two input numbers. Shouldn't it be O(n)? The algorithm is as below:

1. gcd(0, v) = v, because everything divides zero, and v is the largest number that divides v. Similarly, gcd(u, 0) = u. gcd(0, 0) is not typically defined, but it is convenient to set gcd(0, 0) = 0.

2. If u and v are both even, then gcd(u, v) = 2·gcd(u/2, v/2), because 2 is a common divisor.

3. If u is even and v is odd, then gcd(u, v) = gcd(u/2, v), because 2 is not a common divisor. Similarly, if u is odd and v is even, then gcd(u, v) = gcd(u, v/2).

4. If u and v are both odd, and u ≥ v, then gcd(u, v) = gcd((u − v)/2, v). If both are odd and u < v, then gcd(u, v) = gcd((v − u)/2, u). These are combinations of one step of the simple Euclidean algorithm, which uses subtraction at each step, and an application of step 3 above. The division by 2 results in an integer because the difference of two odd numbers is even.

5. Repeat steps 2–4 until u = v, or (one more step) until u = 0. In either case, the GCD is 2^k·v, where k is the number of common factors of 2 found in step 2.
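
For concreteness, here is a minimal Python sketch of steps 1–5 (my own, so the name binary_gcd and the loop structure are assumptions, not part of the algorithm's statement); Python's built-in integers are arbitrary precision, which matters for the complexity question:

```python
def binary_gcd(u: int, v: int) -> int:
    # Step 1: gcd(0, v) = v and gcd(u, 0) = u (and gcd(0, 0) = 0).
    if u == 0:
        return v
    if v == 0:
        return u

    # Step 2: strip common factors of 2, counting them in k.
    k = 0
    while (u | v) & 1 == 0:
        u >>= 1
        v >>= 1
        k += 1

    # Step 3: remove remaining factors of 2 from u; from here on u is odd.
    while u & 1 == 0:
        u >>= 1

    while v != 0:
        # Step 3 again: remove factors of 2 from v.
        while v & 1 == 0:
            v >>= 1
        # Step 4: both are odd here; subtract the smaller from the larger.
        # The even difference has its factor of 2 removed on the next pass.
        if u > v:
            u, v = v, u
        v -= u

    # Step 5: u is the gcd of the odd parts; restore the k common factors of 2.
    return u << k


print(binary_gcd(48, 18))  # 6
```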


Solution

Knuth Volume 2 has a very sophisticated analysis which seems to confirm the obvious guess that the number of steps is, in the worst case, linear in the number of input bits. However, for very large n, each subtraction needs to be charged as O(n) in its own right (for example, because it is done with multiple-precision arithmetic), in which case the total bill is O(n^2).
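
To see the two factors separately, here is a small, informal experiment (the names and random inputs are my own, not from Knuth): it counts only the step-4 subtractions and prints the count against the bit length. The number of subtractions grows roughly in line with the linear bound above, while each subtraction or shift on n-bit operands is itself an O(n) word-by-word operation, so the total cost is O(n)·O(n) = O(n^2).

```python
import random

def binary_gcd_subtractions(u: int, v: int) -> int:
    """Count the step-4 subtractions performed while computing gcd(u, v)."""
    steps = 0
    if u == 0 or v == 0:
        return steps
    # Shared and unshared factors of 2 (steps 2-3) are not counted here.
    while (u | v) & 1 == 0:
        u >>= 1
        v >>= 1
    while u & 1 == 0:
        u >>= 1
    while v != 0:
        while v & 1 == 0:
            v >>= 1
        if u > v:
            u, v = v, u
        v -= u          # one O(n) multiple-precision subtraction
        steps += 1
    return steps

for bits in (256, 512, 1024, 2048):
    u = random.getrandbits(bits) | (1 << (bits - 1))  # force exactly `bits` bits
    v = random.getrandbits(bits) | (1 << (bits - 1))
    print(f"{bits:5d} bits -> {binary_gcd_subtractions(u, v)} subtractions")
```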

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow