Turing, Alan Mathison (1912–1954)

Alan Turing was an English mathematician and computer scientist considered to be one of the fathers of modern digital computing. At an early age, Turing showed signs of the genius and eccentricity that were to become hallmarks of his adult personality. He taught himself to read in three weeks and made a habit of stopping at street corners to read the serial numbers of traffic lights. Later, he became a near-Olympic class runner and ran long distances with an alarm clock tied to his waist to time himself.

At Cambridge, Turing studied under G. H. Hardy and took up the problems of completeness and decidability in mathematics raised by David Hilbert and Kurt Gödel. In 1936, he introduced the idea of what became known as Turing machines – formal devices capable of carrying out any conceivable mathematical procedure that could be expressed as an algorithm. However, the Turing machine was only a theoretical construct at that time, not a working implementation; it remained for later researchers to solve the many practical difficulties required to make the computer a reality. Turing also showed that there are mathematical problems that no Turing machine can ever solve. One of these is the halting problem: deciding, for an arbitrary program and input, whether the program will eventually halt. While his proof was published after that of Alonzo Church, Turing's treatment is more accessible and intuitive.
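The core of Turing's unsolvability argument is self-reference. The following Python sketch (the names `refute` and `paradox` are our own illustrative choices, not from the original proof) shows why no candidate halting-decider can be correct on every input:

```python
# A sketch, in Python rather than Turing's machine notation, of the
# self-referential argument behind the halting problem.

def refute(halts):
    """Given any candidate decider halts(program, data) -> bool,
    construct a program on which the candidate must be wrong."""
    def paradox(p):
        if halts(p, p):      # if the decider claims p(p) halts...
            while True:      # ...then loop forever,
                pass
        return "done"        # ...otherwise halt immediately.
    return paradox

# Example: a candidate that guesses "nothing ever halts".
never = lambda program, data: False
p = refute(never)
print(p(p))  # p(p) halts with "done", so the candidate was wrong
```

Whatever answer a supposed decider gives about `paradox` applied to itself, `paradox` does the opposite, so no such decider can exist.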

During World War II, Turing was a major player at Bletchley Park, near present-day Milton Keynes (a town built after the War), in the successful efforts to crack the Nazi Enigma ciphers. While serving at Bletchley Park (1939–1944), he stayed at the Crown Inn, Shenley Brook End, and somewhere nearby he buried two silver bars, carefully recording the site with respect to local landmarks. When he returned to recover them, the area had been rebuilt. Despite several attempts with metal detectors, he never recovered them, and no one else is known to have found them. The Crown is now a private house, and the area where he buried the bars is a housing estate.

Turing's interest in computing continued after the War, when he worked at the National Physical Laboratory on the development of a stored-program computer (the ACE, or Automatic Computing Engine). In 1948 he moved to the University of Manchester, where the first stored-program digital computer had run its first program earlier that year. In 1950, in the article "Computing Machinery and Intelligence," Turing tackled the problem of artificial intelligence and proposed an experiment now known as the Turing test.

In 1952 Turing's lover helped an accomplice break into Turing's house and commit larceny. Turing went to the police to report the crime. As a result of the police investigation, he was charged with gross indecency (homosexual acts were then a crime in Britain), offered no defense, and was convicted. Following the well-publicized trial, he was given a choice between imprisonment and libido-reducing hormone injections. He chose the injections; the treatment lasted a year and had side effects including the development of breasts. In 1954 he died of poisoning after eating a cyanide-laced apple. Most (though not his mother) believed that his death was intentional, and it was ruled a suicide. According to one urban legend, the Apple company's logo is symbolic of this event: an apple with a bite (or perhaps byte) taken out of it, in rainbow colors recalling the gay-pride flag.

Turing machine

A Turing machine is an abstract model of computer execution and storage introduced in 1936 by Turing to give a mathematically precise definition of algorithm (see Figure 2). A Turing machine can be thought of as a black box that carries out a calculation of some kind on an input number. If the calculation reaches a conclusion, or halts, then an output number is returned; otherwise, the machine theoretically carries on forever. There are infinitely many Turing machines, since there are infinitely many calculations that can be specified by a finite list of rules. A Turing machine that can simulate any other Turing machine is called a universal Turing machine. The concept of Turing machines is still widely used in theoretical computer science, especially in complexity theory and the theory of computation.
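The "finite list of rules" idea can be made concrete with a minimal simulator. The sketch below is our own illustration, not from the text: rules map a (state, symbol) pair to a symbol to write, a head movement, and a next state, and the machine runs until it halts (or a step budget runs out, since it may never halt):

```python
# Minimal sketch of a one-tape Turing machine simulator (illustrative;
# names and the example machine are assumptions, not from the text).

def run_turing_machine(rules, tape, state="start", steps=1000):
    """rules maps (state, symbol) -> (new_symbol, move, new_state),
    where move is -1 (left) or +1 (right). Halts on state "halt".
    Returns the tape contents, or None if no rule applies or the
    step budget is exhausted (the machine may run forever)."""
    tape = dict(enumerate(tape))      # sparse tape; blank symbol is "_"
    head = 0
    for _ in range(steps):
        if state == "halt":
            return "".join(tape[i] for i in sorted(tape)).strip("_")
        key = (state, tape.get(head, "_"))
        if key not in rules:
            return None
        symbol, move, state = rules[key]
        tape[head] = symbol
        head += move
    return None                       # did not halt within the budget

# Example machine: flip every bit of a binary string, then halt.
flipper = {
    ("start", "0"): ("1", +1, "start"),
    ("start", "1"): ("0", +1, "start"),
    ("start", "_"): ("_", -1, "halt"),
}
print(run_turing_machine(flipper, "1011"))  # -> 0100
```

Each distinct rule table is a different Turing machine; a universal machine is one whose rule table interprets an encoding of any other table supplied on its tape.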
