Did you know that there are numbers that cannot be computed by any computer program? It is weird, but true.

And by number, I mean just an ordinary real number. As a perhaps unnecessarily simple example, the result of the division 1/7 looks like this:

0.142857142857142857142857142857142857142857142857142857142857142857…

We can easily implement a program that prints this number. The decimal expansion of 1/7 is infinite, so the program will have to run in an infinite loop to print the “whole” number. Here is a C# implementation:

static void Main()
{
    Console.Write("0.");
    while (true)
        Console.Write("142857");
}

It is a bit of a philosophical question whether this program is really “computing” 1/7 or whether it has the answer hard-coded. You can certainly write a program that will compute the answer more legitimately by long division. The part that really matters is that there is some program that prints the infinite decimal expansion of 1/7, which makes 1/7 a computable number.
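For the curious, here is a minimal sketch of what that long-division approach might look like (the `Digits` helper and its parameters are my own naming for illustration):

```csharp
using System;
using System.Text;

class LongDivision
{
    // First n digits after the decimal point of a/b (for 0 < a < b),
    // computed by ordinary long division: multiply the remainder by 10,
    // emit the quotient digit, and carry the new remainder forward.
    static string Digits(int a, int b, int n)
    {
        var sb = new StringBuilder();
        int remainder = a;
        for (int i = 0; i < n; i++)
        {
            remainder *= 10;
            sb.Append(remainder / b);
            remainder %= b;
        }
        return sb.ToString();
    }

    static void Main()
    {
        // Unlike the hard-coded version, this one genuinely divides.
        Console.WriteLine("0." + Digits(1, 7, 12)); // prints 0.142857142857
    }
}
```

Replacing the `for` loop with an infinite loop would print the whole expansion, digit by digit, forever.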

Similarly, you can write programs that will print the infinite decimal expansion of any rational number, sqrt(2), any algebraic number, π, e, and just about any other number that you can describe. For example, to compute π, you could use one of the known approximation algorithms. You would compute π to greater and greater accuracy, and print more and more digits to the screen. As a side note, we are assuming that the computer that will execute your program has an unlimited amount of memory. While not very realistic, this abstraction is analogous to the famous Turing machine, and it is extremely useful for understanding certain deep truths about computation.
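As a sketch of one such approximation algorithm, here is Machin's classical formula π = 16·arctan(1/5) − 4·arctan(1/239), evaluated in fixed-point arithmetic with BigInteger. The helper names are my own; a true digit-printing program would re-run this at ever-higher precision in a loop:

```csharp
using System;
using System.Numerics;

class MachinPi
{
    // arctan(1/x), scaled by the fixed-point unit "one", using the series
    // arctan(1/x) = 1/x - 1/(3x^3) + 1/(5x^5) - ...
    static BigInteger ArctanRecip(int x, BigInteger one)
    {
        BigInteger sum = 0, term = one / x, x2 = x * x;
        for (int n = 1; term != 0; n += 2)
        {
            sum += (n % 4 == 1) ? term / n : -(term / n);
            term /= x2;
        }
        return sum;
    }

    // π truncated to the given number of decimal places.
    static string Pi(int digits)
    {
        BigInteger one = BigInteger.Pow(10, digits + 10); // 10 guard digits
        BigInteger pi = 16 * ArctanRecip(5, one) - 4 * ArctanRecip(239, one);
        string s = pi.ToString();
        return s[0] + "." + s.Substring(1, digits);
    }

    static void Main()
    {
        Console.WriteLine(Pi(15)); // prints 3.141592653589793
    }
}
```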

Interestingly, there are numbers that are non-computable. A number is non-computable if there is no program that prints its infinite decimal expansion (adding trailing zeros if a finite expansion is possible). How do we know that there are such numbers? The key insight is that there are more real numbers than there are C# programs. That is pretty surprising, given that both the number of real numbers and the number of C# programs are infinite. Nevertheless, it is true.

C# programs are countable. That means that we can assign a different positive integer to each program. The shortest valid C# program will be 1, the next shortest will be 2, and so forth. If there are multiple valid programs of the same length, we will sort the programs lexicographically and assign integers in that order. By the way, there are many different ways in which to define a “valid program”. One approach is to say that to be valid, the program must compile, run without exceptions, and print an infinite sequence of digits, separated at exactly one place with a decimal point.
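This length-first, then lexicographic ordering (often called shortlex) is easy to sketch on a toy alphabet; real programs are just strings over a larger character set, with the invalid ones filtered out. The `ShortLex` name and the breadth-first approach are my own illustrative choices:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

class Enumeration
{
    // Yields every string over the given (sorted) alphabet in shortlex
    // order: shorter strings first, ties broken lexicographically.
    // A breadth-first queue produces exactly this order.
    static IEnumerable<string> ShortLex(string alphabet)
    {
        var queue = new Queue<string>();
        queue.Enqueue("");
        while (true)
        {
            string s = queue.Dequeue();
            yield return s;
            foreach (char c in alphabet)
                queue.Enqueue(s + c);
        }
    }

    static void Main()
    {
        // "Program" 1, "program" 2, ... over a two-letter alphabet.
        int i = 1;
        foreach (string s in ShortLex("ab").Take(7))
            Console.WriteLine($"{i++}: \"{s}\"");
    }
}
```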

Unlike C# programs, real numbers are uncountable. There is no way to assign a different integer to each real number… there are just too many real numbers! This fact was proved by Georg Cantor in a couple different ways, the most famous of which is the diagonalization argument.

Not only do non-computable numbers exist, but in fact they are vastly more abundant than computable numbers. Many, many real numbers are simply infinite sequences of seemingly random digits, with no pattern or special property. But even though there are so many non-computable numbers, concrete examples tend to be weird and a little strenuous to explain.

As one such example, consider a number whose part before the decimal point is 0. We choose the i-th digit after the decimal point to be different from the digit in the same position in the number printed by program i (by “program i”, I mean the program associated with integer i, as described earlier in the article). So, each digit after the decimal point guarantees that the constructed number will differ from the number printed by a particular program. This demonstrates that the constructed number is different from any number printed by a computer program! By the way, this construction is basically Cantor’s diagonalization argument, only recast in different terminology.
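On a finite sample, the construction looks like this. Note that the full version is not something any real program could run, since it would need an oracle listing what every program prints; the rows below are arbitrary made-up digit streams standing in for the outputs of “programs” 1 through 4:

```csharp
using System;
using System.Text;

class Diagonal
{
    // Given the digit streams printed by "programs" 1..n (truncated to
    // n digits each), build a number whose i-th digit differs from the
    // i-th digit of stream i, by adding 1 mod 10.
    static string DiagonalDigits(string[] rows)
    {
        var sb = new StringBuilder();
        for (int i = 0; i < rows.Length; i++)
            sb.Append((rows[i][i] - '0' + 1) % 10);
        return sb.ToString();
    }

    static void Main()
    {
        string[] rows = { "1428", "7182", "4142", "3141" };
        // The result differs from row i at position i, hence from every row.
        Console.WriteLine("0." + DiagonalDigits(rows)); // prints 0.2252
    }
}
```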

Theoretical foundations of computer science, which underlie everything we do as programmers, are nothing short of amazing. If you would like to read more about computability and related concepts, check out Charles Petzold’s book, The Annotated Turing. Alan Turing’s original paper introducing computable numbers is also available, but Petzold’s book is a lot easier to read.

[Update] See the forum on the reddit page of this article for additional in-depth discussion of this topic.
