Newton's method is an iterative technique for approximating the roots of a function when they cannot be found algebraically. Given a function f(x) and an initial estimate x_0 for a root, the first approximation is

x_1 = x_0 - \frac{f(x_0)}{f'(x_0)}

The second is

x_2 = x_1 - \frac{f(x_1)}{f'(x_1)}

and in general

x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)}

The more times this process is repeated, the better the approximation will be.
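As a rough illustration, the iteration can be written in a few lines of Python; the function name, tolerance, and iteration cap below are arbitrary choices for this sketch, not anything specified above.

```python
def newton(f, f_prime, x0, tol=1e-10, max_iter=50):
    """Iterate x_{n+1} = x_n - f(x_n)/f'(x_n) until f(x_n) is near zero."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:          # close enough to a root
            return x
        x = x - fx / f_prime(x)    # one Newton step
    return x                       # best estimate after max_iter steps
```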

For best results, Newton's method should be initiated where |f'(x_0)| is sufficiently large, |f''(x_0)| is sufficiently small, and x_0 is close to the intended root. This helps avoid converging to an unexpected root, or failing to converge at all, as in the sketch below.
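For example (an illustrative case not taken from the text above), applying the iteration to f(x) = x^3 - 2x + 2 starting from x_0 = 0 never converges: the iterates simply oscillate between 0 and 1.

```python
# Hypothetical non-converging example: f(x) = x^3 - 2x + 2 from x_0 = 0.
f = lambda x: x**3 - 2*x + 2
fp = lambda x: 3*x**2 - 2

x = 0.0
for n in range(6):
    x = x - f(x) / fp(x)
    print(n + 1, x)    # prints 1.0, 0.0, 1.0, 0.0, ... (a two-cycle)
```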


Suppose we are given the function

f(x) = e^{3x} + x^3

We will start with the initial estimate x_0 = -0.5. The first approximation will be

x_1 = -0.5 - \frac{f(-0.5)}{f'(-0.5)} = -0.5 - \frac{e^{-1.5} + (-0.5)^3}{3e^{-1.5} + 3(-0.5)^2}= -0.569

The second will be

x_2 = -0.569 - \frac{f(-0.569)}{f'(-0.569)} = -0.569 - \frac{e^{-1.707} + (-0.569)^3}{3e^{-1.707} + 3(-0.569)^2} = -0.567

Plugging this into the original function, we get

f(-0.567) = e^{3 \cdot -0.567} + (-0.567)^3 = 0.000217

The more iterations we perform, the closer f(x_n) will get to zero.
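The numbers above can be reproduced with a short script; the formatting choices below are incidental, and small differences from the hand computation come from the three-decimal rounding used in the text.

```python
import math

# Worked example from the text: f(x) = e^{3x} + x^3 starting at x_0 = -0.5.
f  = lambda x: math.exp(3 * x) + x**3
fp = lambda x: 3 * math.exp(3 * x) + 3 * x**2

x = -0.5
for n in range(1, 4):
    x = x - f(x) / fp(x)
    print(f"x_{n} = {x:.6f},  f(x_{n}) = {f(x):.6f}")
# The iterates settle near the true root x ≈ -0.567143.
```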

Damped Newton's Method

To help improve convergence, Newton's method may be damped with a factor α_n chosen from (0,1] at each iteration.

x_{n+1} = x_n - \alpha_n\frac{f(x_n)}{f'(x_n)}

Ideally, each value of α_n should bring the next iterate as close to the root as possible. One possible method of determining α_n is the Bank-Rose algorithm.[1]
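The sketch below shows one simple way a damping factor might be chosen in practice: start each step with α = 1 and halve it until the residual |f| decreases. This halving rule is an illustrative assumption only, not the Bank-Rose algorithm cited above.

```python
def damped_newton(f, f_prime, x0, tol=1e-10, max_iter=50):
    """Newton's method with a per-step damping factor alpha in (0, 1].

    The damping rule (halve alpha until |f| decreases) is an illustrative
    choice, not the Bank-Rose algorithm referenced in the text.
    """
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:
            return x
        step = fx / f_prime(x)     # the undamped Newton step
        alpha = 1.0
        # Shrink alpha until the damped step actually reduces |f|.
        while abs(f(x - alpha * step)) >= abs(fx) and alpha > 1e-8:
            alpha /= 2
        x = x - alpha * step
    return x
```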