1916: Claude Elwood Shannon, the father of information theory and the man who put the term bit into print, is born.

Shannon was educated at the University of Michigan and the Massachusetts Institute of Technology and spent most of his career working for Bell Labs.

Shannon’s 1938 master’s thesis, A Symbolic Analysis of Relay and Switching Circuits, used Boolean algebra to establish the theoretical basis of modern digital circuits. The paper grew out of Shannon’s insight that the on-off states of relays and switches map directly onto the true and false values of Boolean logic.
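The core of that insight can be sketched in a few lines of Python (a modern illustration, not Shannon’s own notation): contacts wired in series behave like Boolean AND, contacts in parallel like OR, and a normally closed contact like NOT.

```python
def series(a: bool, b: bool) -> bool:
    """Two switches in series conduct only if both are closed -- Boolean AND."""
    return a and b

def parallel(a: bool, b: bool) -> bool:
    """Two switches in parallel conduct if either is closed -- Boolean OR."""
    return a or b

def normally_closed(a: bool) -> bool:
    """A normally closed relay contact conducts when its coil is off -- Boolean NOT."""
    return not a

# Composing series and parallel contacts yields any Boolean function.
# Example: a circuit that conducts when exactly one switch is closed (XOR).
def exclusive(a: bool, b: bool) -> bool:
    return parallel(series(a, normally_closed(b)),
                    series(normally_closed(a), b))

for a in (False, True):
    for b in (False, True):
        print(a, b, exclusive(a, b))
```

Because any truth table can be built this way, relay hardware can compute any Boolean function, which is exactly what the thesis demonstrated.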

His paper was widely cited, laying the groundwork for modern digital circuit design. It has been called “one of the most significant master’s theses of the 20th century.” Not bad for a 22-year-old kid from a small town in Michigan.

The term bit, short for “binary digit,” would first appear in print a decade later, in Shannon’s landmark 1948 paper; Shannon credited the coinage to his Bell Labs colleague John W. Tukey. Later wags would expand the terminology to include byte (usually an 8-bit binary number) and even nybble (half a byte, or 4 bits).
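The units nest neatly, as a quick Python sketch shows (the byte value is arbitrary, chosen for illustration):

```python
byte = 0b10110100          # one byte: eight binary digits (decimal 180)
high_nybble = byte >> 4    # top four bits:    0b1011 -> 11
low_nybble = byte & 0x0F   # bottom four bits: 0b0100 -> 4
print(high_nybble, low_nybble)
```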

Shannon’s later work, capped by the landmark 1948 paper A Mathematical Theory of Communication, fleshed out communication theory more fully, defining key concepts and predicting the upper limits of communication rates over telephone lines as well as more modern optical and wireless transmissions. His framework and terminology remain standard to this day.
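That upper limit is the famous Shannon–Hartley channel capacity, C = B·log₂(1 + S/N). A short Python sketch with illustrative numbers (not figures from this article) shows why dial-up modems topped out near 30 kbit/s on a voice-grade phone line:

```python
import math

def channel_capacity_bps(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley limit: C = B * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * math.log2(1.0 + snr_linear)

# Roughly 3 kHz of usable bandwidth at about a 30 dB signal-to-noise ratio.
snr = 10 ** (30 / 10)  # 30 dB -> linear power ratio of 1000
print(f"{channel_capacity_bps(3000, snr):.0f} bit/s")  # ≈ 29,902 bit/s
```

No coding scheme, however clever, can reliably push data through such a channel faster than this bound.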

Shannon’s interests were eclectic and included cryptography (on which he briefly collaborated with Alan Turing during World War II), data and signal processing, and Mendelian genetics. He was an avid tinkerer, creating devices such as a chess-playing machine, a chairlift to carry his three children 600 feet from his house to the lakeside, rocket-powered flying saucers, a Rubik’s-cube–solving machine and juggling machines.

He also built an electromechanical “mouse” that could navigate arbitrary mazes, learning the correct path out as it went. The device has been called one of the first learning machines ever built.
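Theseus used banks of relays as its memory; the strategy itself can be sketched in a few lines of Python (my own simplification with a made-up maze, not Shannon’s relay design): explore by trial and error, backing out of dead ends, and remember the moves that worked so later runs need no search.

```python
# '#' = wall, 'S' = start, 'G' = goal (a hypothetical maze for illustration)
MAZE = [
    "#######",
    "#S...##",
    "###.#G#",
    "#...#.#",
    "#.###.#",
    "#.....#",
    "#######",
]

def learn_path(maze):
    """First run: depth-first trial and error, retreating from dead ends.
    Returns the successful path, which can simply be replayed afterward."""
    start = next((r, c) for r, row in enumerate(maze)
                 for c, ch in enumerate(row) if ch == "S")
    goal = next((r, c) for r, row in enumerate(maze)
                for c, ch in enumerate(row) if ch == "G")
    path, seen = [start], {start}
    while path:
        r, c = path[-1]
        if (r, c) == goal:
            return path                      # the remembered route
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if maze[nr][nc] != "#" and (nr, nc) not in seen:
                seen.add((nr, nc))
                path.append((nr, nc))
                break
        else:
            path.pop()                       # dead end: back up one square

learned = learn_path(MAZE)
print(learned)
```

The “learning” is nothing more than stored state, but that was the point: a machine whose behavior improves with experience.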

A gadget he kept on his desk, called the “Ultimate Machine,” consisted of nothing but a box with a switch on it. When the switch was flipped on, the lid of the box opened and a mechanical hand emerged to flip the switch off, after which it retreated back into the box. The idea came from Marvin Minsky; science-fiction author Arthur C. Clarke, who saw the device on Shannon’s desk, later wrote about it.

Shannon was sometimes known to ride a unicycle down the halls of Bell Labs — while juggling.

Shannon’s theories weren’t limited to the abstract: He also used mathematics to get an edge in gambling and the stock market, using a principle developed by Bell Labs colleague John L. Kelly Jr. — the Kelly criterion. Together with MIT mathematician Edward O. Thorp, Shannon traveled to Las Vegas, making a killing at the blackjack tables (and at least once using a “wearable” computer he developed to help compute the odds).
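The Kelly criterion answers a simple question: given an edge, what fraction of your bankroll should you stake to maximize long-run growth? A minimal Python sketch, with illustrative numbers not drawn from the article:

```python
def kelly_fraction(p_win: float, net_odds: float) -> float:
    """Kelly criterion: f* = (b*p - q) / b, where b is the net odds paid
    on a win, p the probability of winning, and q = 1 - p of losing."""
    q = 1.0 - p_win
    return (net_odds * p_win - q) / net_odds

# An even-money bet (b = 1) with a 52% chance of winning -- the kind of
# slim edge a card counter might enjoy at a favorable blackjack count.
print(f"{kelly_fraction(0.52, 1.0):.1%}")  # stake 4.0% of bankroll
```

Bet more than f* and volatility eats the edge; bet less and growth is left on the table, which is why both Shannon and Thorp treated the formula as a money-management rule rather than a tip on which bets to make.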

Later, Thorp would use these theories to found a hedge fund, Princeton-Newport Partners, making him one of the first successful “quants” on Wall Street.

Shannon died in 2001.

Sources: Wikipedia, Encyclopedia Britannica, MIT, Bell Labs, NYU

Photo: Claude Shannon invented a clever electromechanical mouse, which he called Theseus. It was one of the earliest attempts to build a machine that could learn and one of the first experiments in artificial intelligence.

Courtesy Bell Labs

See Also:

April 30, 1897: J.J. Thomson Announces the Electron … Sort Of

April 30, 1945: New Generation U-Boat Too Little, Too Late

March 25, 1916: Ishi Dies, a World Ends

Sept. 15, 1916: All Disquiet on the Western Front
