In the past sixty years or so, computers have migrated from room-size megaboxes to desktops to laptops to our pockets. But the real history of machine-assisted human computation (“computer” originally referred to the person, not the machine) goes back even further.

First in the historical record was the abacus, which helped the ancient technorati gain an edge over trading partners still counting cows and amphorae by hand. The oldest known complex computing device, the Antikythera mechanism, dates back to around 87 B.C.; the Greeks are thought to have used this gear-operated contraption (found in a shipwreck in the Aegean Sea early in the 20th century, though its significance wasn’t fully understood until 2006) to calculate astronomical positions and help them navigate the seas. Computing took another leap in 1843, when English mathematician Ada Lovelace wrote the first computer algorithm, in collaboration with Charles Babbage, who devised the design for the first programmable computer. But the modern computing-machine era began with Alan Turing’s conception of the Turing machine and with three Bell Labs scientists’ invention of the transistor, which made modern-style computing possible and earned them the 1956 Nobel Prize in Physics. For decades, computing technology was exclusive to the government and the military; later, academic institutions came online, and Steve Wozniak built the circuit board for the Apple-1, making home computing practicable. On the connectivity side, Tim Berners-Lee created the World Wide Web, and Marc Andreessen built a browser, and that’s how we came to live in a world where our glasses can tell us what we’re looking at. With wearable computers, embeddable chips, smart appliances, and other advances in progress and on the horizon, the journey toward building smarter, faster, and more capable computers is clearly just beginning.

Infographic by Julie Rossman.