A Nerd With Free Time, *Le Gasp*

So today is 1/6/18 if you write your dates month/day/year, which is a pretty accurate approximation of phi, the golden ratio, for a calendar date. Since I enjoy numerical analysis, I figured, why not celebrate by approximating phi? And as my New Year's resolution is to exposit and engage more with the greater mathematics community, I'm writing about it here.

The naive way is to find a large pair of consecutive Fibonacci numbers and take their ratio, so that is where we will start. The "dumbest" way of doing this is just deciding we want the xth consecutive Fibonacci pair, adding until we get there, and taking the ratio. This algorithm runs very quickly (on the order of 0.0001 seconds) and gives 5 to 10 correct digits of phi.
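A minimal sketch of that naive loop in Python (my own variable names; the original code isn't shown in the post):

```python
from fractions import Fraction

def fib_ratio(x):
    """Approximate phi as the ratio of the xth consecutive Fibonacci pair."""
    a, b = 1, 1
    for _ in range(x):
        a, b = b, a + b
    return Fraction(b, a)  # exact ratio; float() of this gives the decimal value

# Forty pairs in is already accurate to well past double precision:
print(float(fib_ratio(40)))
```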

Now let’s try a smarter algorithm to actually set some sort of benchmark. To block out the algorithm in pseudocode:

input -> tolerance_variable

x0 = 1, x1 = 1, x2 = 2

difference = abs[(x2/x1) – (x1/x0)]

while (difference >= tolerance_variable):

{x3 = x2 + x1

x0 = x1, x1 = x2, x2 = x3

difference = abs[(x2/x1) – (x1/x0)]}

return x2/x1
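In runnable Python, the pseudocode above might look like this (note that the difference has to be recomputed inside the loop, or it never changes):

```python
def phi_to_tolerance(tol):
    """Iterate Fibonacci ratios until successive approximations differ by less than tol."""
    x0, x1, x2 = 1, 1, 2
    difference = abs(x2 / x1 - x1 / x0)
    while difference >= tol:
        x0, x1, x2 = x1, x2, x1 + x2
        difference = abs(x2 / x1 - x1 / x0)
    return x2 / x1

print(phi_to_tolerance(1e-10))
```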

This algorithm lets us choose an arbitrary tolerance and runs until the difference between successive approximations is less than that. Since consecutive Fibonacci ratios alternate above and below phi, the true error is smaller than that difference, so our output error is at least as small as our input tolerance. Running this algorithm for a variety of tolerances gives us the following plot:

We see this algorithm is approximately linear with respect to the desired level of accuracy. My suspicion is that this isn't a truly linear algorithm, as division is O(n^2) in the number of digits n. At the low precisions on this portion of the curve, each division is cheap and few total divisions are needed, since we reach the desired approximation quickly. I'd love to stress test this a bit, but python keeps whining about "can't convert infinity to an int". Wimp.

What Else Do We Know About Phi?

Maybe there's a better-than-linear algorithm. We could think about other characterizations of phi. For example, we might consider the continued fraction form. (Images courtesy of Wikipedia.org)

The problem is that trying to solve this is equivalent to our Fibonacci method. Try truncating the continued fraction somewhere and you'll end up with a ratio of consecutive Fibonacci numbers, regardless of where you choose to truncate it. (Are undergrads allowed to leave exercises to the reader?) We wouldn't expect this to be a better method.

We also know that φ^2 = φ + 1, so finding the positive root of φ^2 – φ – 1 = 0 would give us φ. Using the quadratic formula we get φ = (1 + √5)/2.

So, to find the square root of 5, we use the bisection method to find the positive root of x^2 – 5 = 0. The Wikipedia write-up of the bisection method is good enough that I'll just point the interested reader there. For our purposes, the method consists of drawing an interval around where we think the root is and repeatedly cutting it in half, keeping the half that still contains the root. The bisection method is relatively easy to code compared to other root-finding methods and is…pretty terrible. Take a look:
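(My implementation isn't reproduced in the post; a minimal sketch of the idea, with an interval I've chosen to bracket √5, might look like this:)

```python
def bisect_sqrt5(tol):
    """Bisection on f(x) = x^2 - 5; the root sqrt(5) lies in [2, 3] since f(2) < 0 < f(3)."""
    f = lambda x: x * x - 5
    lo, hi = 2.0, 3.0
    while hi - lo >= tol:
        mid = (lo + hi) / 2
        if f(mid) < 0:
            lo = mid  # root is in the right half
        else:
            hi = mid  # root is in the left half
    return (lo + hi) / 2

# phi = (1 + sqrt(5)) / 2
print((1 + bisect_sqrt5(1e-12)) / 2)
```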

In the words of my wise numerical analysis professor, "That's gross." It's not actually that terrible, as the plot is still linear. The larger y-intercept of the plot likely has to do with slight differences in my implementation of the arbitrary-precision code, the fact that I've started watching Numberphile videos in the background (I've been at this a while, guys), or the ire of my small toaster-like computer.

The linearity of the bisection method actually makes intuitive sense, as each step should be roughly halving the error from the previous step. I should have thought about that before taking the time to code it up and compute phi with it, eh? It’s not terrible, but it still ain’t great. It’s definitely not the improvement I’d hoped for.

Newton Saves The Day…Again

So, it’s late where I am and I didn’t expect the coding, the writing or the brain-wracking to take quite this long. As a last ditch effort to send Phi day off on a good note, I’m going to resort to Newton’s method.

Growing up, I remember my dad would check my algebra homework questions (stuff like cube roots, polynomial roots, etc) with Newton’s method. In a spreadsheet. In Excel. *Cringe* As an already budding mathematician, this bothered me.

Childhood frustrations aside, Newton's method does have a lot going for it. If our function is differentiable and convex (hooray for polynomials), then Newton's method will converge quadratically (read "better than anything else I've tried today" or "very, very fast"). Best of all, I already had a version of it coded and merely needed to plug in the desired polynomial. The wiki page on Newton's method has the details.

Rather than solving for the value of the square root of 5, I elected to find the positive root of the polynomial φ^2 – (φ + 1) directly. Finding √5 and then substituting into the quadratic formula for phi would likely be somewhat faster, but Newton's method is extremely fast already and it seems largely unnecessary at the errors we're benchmarking with. Here's the benchmark:
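My benchmarked implementation isn't reproduced here, but the core iteration is short enough to sketch (using f(x) = x^2 – x – 1 and stopping once the Newton step drops below the tolerance):

```python
def newton_phi(tol, x=2.0):
    """Newton's method on f(x) = x^2 - x - 1, whose positive root is phi."""
    while True:
        step = (x * x - x - 1) / (2 * x - 1)  # f(x) / f'(x)
        x -= step
        if abs(step) < tol:
            return x

print(newton_phi(1e-13))
```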

And it is ridiculously fast. The apparent constant speed of the algorithm comes from the fact that the error of the next step is roughly the square of the error from the previous step. So if your initial error is 2^-1, the nth step will have error on the order of (2^-1)^(2^n) = 2^(-2^n), which means even for 100 decimal places of accuracy the algorithm only needs to iterate around 10 times.
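To watch the digit-doubling happen, here's a quick sketch of my own (not the benchmarked code) using Python's Decimal type for extra precision:

```python
from decimal import Decimal, getcontext

getcontext().prec = 120  # work with ~120 significant digits

def newton_steps(n):
    """Run n Newton iterations on f(x) = x^2 - x - 1 starting from x = 2."""
    x = Decimal(2)
    for _ in range(n):
        x -= (x * x - x - 1) / (2 * x - 1)
    return x

true_phi = (1 + Decimal(5).sqrt()) / 2
for n in range(1, 8):
    # the exponent of the error roughly doubles with each iteration
    print(n, abs(newton_steps(n) - true_phi))
```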

While we could have arrived at a satisfactory approximation of phi with either of the other two methods in a relatively short amount of time, it is fun to see how efficient a method you can create.

Things I Didn’t Try

If you want to mess around with different ways of approximating the golden ratio (yay for pop science names), Wikipedia has a nice list of alternative definitions of phi, including infinitely nested radicals, infinite series and so on.

If you can think of faster ways to compute accurate approximations of phi, please let me know.

301 digits of φ

1.6180339887498948482045868343656381177203091798057628621354486227052604628
18902449707207204189391137484754088075386891752126633862223536931793180060
76672635443338908659593958290563832266131992829026788067520876689250171169
62070322210432162695486262963136144381497587012203408058879544547492461856
953649…

These were calculated with the Newton's method implementation benchmarked above. It took 10 iterations to generate this many digits, completing in 0.00476908683777 seconds. Hope you enjoyed this as much as I did.