f'(x0) is the derivative of f evaluated at x0. You can compute an approximation to f' by evaluating:

f'(x0) ~ (f(x0 + epsilon) - f(x0)) / epsilon

for a suitably tiny value of epsilon. (Because f is linear, any reasonable value of epsilon will give essentially the same result; for more general functions f, choosing a good epsilon is entirely too subtle to be discussed in an S.O. post -- enroll in an upper-division undergraduate numerical analysis course.)
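As a minimal sketch of the forward-difference approximation above (the function name `derivative` and the default `epsilon` are my choices, not part of the answer):

```python
def derivative(f, x0, epsilon=1e-6):
    """Forward-difference approximation of f'(x0).
    epsilon=1e-6 is a reasonable default for well-scaled f;
    see the caveat above about choosing epsilon in general."""
    return (f(x0 + epsilon) - f(x0)) / epsilon

# For a linear f, the choice of epsilon barely matters:
f = lambda x: 5 * x + 4 - (2 * x + 3)   # f(x) = 3x + 1
print(derivative(f, 0.0))               # close to 3.0
```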
However, since you want to avoid "human" methods, I should point out that for the specific case of linear equations, Newton's method always converges in a single iteration, and is in fact essentially equivalent to the usual algebraic solution technique.
To illustrate this, consider your example. To use Newton's method, one needs to transform the equation so that it looks like f(x) = 0:
5x + 4 = 2x + 3
5x + 4 - (2x + 3) = 0
So f(x) = 5x + 4 - (2x + 3). The derivative of f(x) is f'(x) = 5 - 2 = 3. If we start with an initial guess x0 = 0, then Newton's method gives us:
x1 = x0 - f(x0)/f'(x0)
= 0 - (5*0 + 4 - (2*0 + 3))/3
= 0 - (4-3)/3
= -1/3
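The single Newton step above can be written directly (a sketch; the lambda names are mine):

```python
f      = lambda x: 5 * x + 4 - (2 * x + 3)  # f(x) = 3x + 1
fprime = lambda x: 5 - 2                    # f'(x) = 3

# One Newton iteration from x0 = 0 lands exactly on the root:
x0 = 0.0
x1 = x0 - f(x0) / fprime(x0)
print(x1)   # -1/3
```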
These are actually exactly the same operations that a human would use to solve the equation, somewhat subtly disguised. Taking the derivative isolated the x terms (5x - 2x = 3x), and evaluating at zero isolated the terms without an x (4 - 3 = 1). Then we divided the constant coefficient by the linear coefficient and negated to get x.
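Putting the two pieces together, you can solve any linear equation lhs(x) = rhs(x) with one Newton step and a finite-difference derivative, with no hand calculus at all. This is a sketch; `solve_linear` is a hypothetical helper name of my own:

```python
def solve_linear(lhs, rhs, x0=0.0, epsilon=1e-6):
    """Solve lhs(x) == rhs(x) for linear lhs/rhs with one Newton step,
    using a forward-difference approximation of the derivative."""
    f = lambda x: lhs(x) - rhs(x)
    fprime = (f(x0 + epsilon) - f(x0)) / epsilon  # ~ constant for linear f
    return x0 - f(x0) / fprime

# 5x + 4 = 2x + 3  =>  x = -1/3
x = solve_linear(lambda x: 5 * x + 4, lambda x: 2 * x + 3)
print(x)   # close to -1/3
```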