Newton's Method


Let's say we want to solve $f(x) = 0$ for some function where solving explicitly is impossible, but you can still evaluate $f(x)$ and $f'(x)$.

You might come up with the clever idea of approximating $f(x)$ by its tangent line, then finding where the line hits zero.

The tangent line at $x_1$ has slope $f'(x_1)$ and drops a height of $f(x_1)$ over some horizontal run $b$, the base of the triangle. Since the derivative is the slope:

$$f'(x_1) = \frac{f(x_1)}{b} \implies b = \frac{f(x_1)}{f'(x_1)}$$

Intuitively, multiplying by $\frac{dx}{dy}$ converts a change in $y$ (here the height $f(x_1)$) into the corresponding change in $x$. Using $b$ we can now find $x_2$:

$$x_2 = x_1 - b = x_1 - \frac{f(x_1)}{f'(x_1)}$$

Thus the general formula for improving our guess $x_n$ to a better guess $x_{n+1}$ is

$$x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)}$$
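As a sketch, the update rule translates directly into a few lines of Python (the helper name `newton` and the tolerance/iteration-cap parameters are my own choices, not part of the derivation above):

```python
import math

def newton(f, f_prime, x, tol=1e-12, max_iter=50):
    # Repeatedly apply x_{n+1} = x_n - f(x_n) / f'(x_n)
    # until f(x) is close enough to zero.
    for _ in range(max_iter):
        if abs(f(x)) < tol:
            break
        x = x - f(x) / f_prime(x)
    return x

# Example: solve cos(x) = x, which has no closed-form solution.
root = newton(lambda x: math.cos(x) - x,
              lambda x: -math.sin(x) - 1,
              x=1.0)
```

Starting from $x = 1$, this settles on the fixed point of cosine near $0.739$.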

Now let's have some fun! First, let's see how to compute $\sqrt{a}$.
We need to construct a function that is zero at $\sqrt{a}$; let's use $f(x) = x^2 - a$.
Calculus tells us $f'(x) = 2x$. We can use our formula now!

$$x_{n+1} = x_n - \frac{x_n^2 - a}{2x_n} = x_n - \frac{x_n - \frac{a}{x_n}}{2}$$

Awesome! Let's try this in Python:

```python
def improve(g, a):
    return g - (g - a / g) / 2

g = 1             # initial guess
g = improve(g, 2) # g = 1.5
g = improve(g, 2) # g = 1.416666666666...
g = improve(g, 2) # g = 1.414215686274...
g = improve(g, 2) # g = 1.414213562374...
```

After 4 iterations we're already correct to about 12 decimal places! Newton's method converges quickly: as we approach the root, the linear approximation gets better and better, since $x^2$ looks like a line as you zoom in.
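One way to see this speed is to track the error $|g - \sqrt{2}|$ at each step: with quadratic convergence, the number of correct digits roughly doubles every iteration. A small sketch (the `errors` list is just for illustration):

```python
import math

def improve(g, a):
    return g - (g - a / g) / 2

g, errors = 1.0, []
for _ in range(4):
    g = improve(g, 2)
    errors.append(abs(g - math.sqrt(2)))

# The error shrinks roughly quadratically: each new error is
# on the order of the previous error squared
# (~1e-1, ~1e-3, ~1e-6, ~1e-12).
```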