Update: sqrt4 has better chances of eventually exiting the loop than sqrt5, because its stopping condition compares the approximation from one iteration with the one from the iteration before it. The computation tends to reduce the error, so eventually the value computed for b is so close to the exact square root that x/b differs from b only in the last bit. At that point the value computed for (x / b + b) / 2 in the available finite precision equals b, and the iteration stops.
For example, if we are calculating sqrt(2) and have reached the approximation b = 1.414213562373095, we have:
>>> b
1.414213562373095
>>> 2/b # Close but not quite equal to b,
1.4142135623730951 # iteration in sqrt5 continues
>>> (2/b + b)/2
1.414213562373095 # Exactly equal to b, sqrt4 stops
As you can see, once b has reached 1.414213562373095, its value is no longer changed by the calculation in the loop. But because b and 2/b still differ in the last digit, sqrt5 will never exit.
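That stopping rule can be sketched in Python as follows (a minimal reconstruction of the sqrt4-style condition, not your exact code: the loop stops as soon as an iteration no longer changes the approximation):

```python
def sqrt4(x):
    """Heron iteration that stops when an iteration no longer
    changes the approximation (sqrt4-style stopping condition)."""
    b = x
    prev = 0.0
    while b != prev:          # compare with the previous iterate
        prev = b
        b = (x / b + b) / 2
    return b

print(sqrt4(2.0))   # stops at 1.414213562373095, as in the session above
```

Once b reaches the value where (x / b + b) / 2 rounds back to b, the comparison with the previous iterate succeeds and the loop exits, even though x/b is still one bit away from b.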
The Babylonian and Heron's methods are the same algorithm, and they coincide with the Newton-Raphson method for solving y² = x. The difference between the various implementations you have is the stopping condition. For example:
h = x / b;              /* initialize h so the first test is b != x/b */
while (b != h) {
    b = (h + b) / 2;
    h = x / b;
}
is the same as:
while (b != x/b) {
    b = (x / b + b) / 2;
}
Of course, b != x/b is not a very good stopping condition, because b may never become the mathematically exact square root of x: if b is not the exact square root, x/b does not equal b. For example, in Python:
>>> sqrt(2)
1.4142135623730951
>>> 2/sqrt(2)
1.414213562373095
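You can see the non-termination concretely by running the iteration with a hard cap on the number of steps (a minimal sketch; the cap is only there so the demonstration cannot hang):

```python
x = 2.0
b = x
for _ in range(100):          # cap the iterations; a `while b != x / b`
    new_b = (x / b + b) / 2   # loop with this update would spin forever
    if new_b == b:            # b stopped changing: we are at the fixed point
        break
    b = new_b

print(b == x / b)   # False: the sqrt5-style condition is never satisfied
```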
You should almost always use a bound on the relative difference as the stopping condition, for example:
eps = 1e-6;
while (abs(b-h)/b > eps) { ...
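In Python, that stopping rule could look like this (a sketch for positive x; eps is the relative tolerance from the snippet above):

```python
def sqrt_rel(x, eps=1e-6):
    """Heron iteration that stops once the relative difference
    between b and x/b drops below eps. Assumes x > 0."""
    b = x
    h = x / b
    while abs(b - h) / b > eps:
        b = (h + b) / 2
        h = x / b
    return b
```

Because the exact square root always lies between h and b (their product is x), the loop exits after a handful of iterations for any positive x, without ever depending on exact floating-point equality.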