Dag Spicer is expecting a special package soon, but it’s not a Black Friday impulse buy. The fist-sized motor, greened by corrosion, is from a historic room-sized computer intended to ape the human brain. It may also point toward artificial intelligence's future.

Spicer is senior curator at the Computer History Museum in Mountain View, California. The motor in the mail is from the Mark 1 Perceptron, built by Cornell researcher Frank Rosenblatt in 1958. Rosenblatt's machine learned to distinguish shapes such as triangles and squares seen through its camera. When shown examples of different shapes, it built “knowledge” using its 512 motors to turn knobs and tune its connections. "It was a major milestone," says Spicer.
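The Mark 1's knob-tuning has a simple software analogue: Rosenblatt's perceptron learning rule, which nudges each connection weight up or down after every mistake. Below is a minimal sketch of that rule; the tiny 2x2 "images" and labels are invented for illustration, not drawn from Rosenblatt's experiments.

```python
# Minimal sketch of the perceptron learning rule -- a software analogue of
# the Mark 1's motor-driven knob adjustments. Data is invented for illustration.

def train(samples, labels, epochs=20, lr=0.1):
    # One weight per "pixel", plus a bias: the knobs to be tuned.
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = y - pred  # -1, 0, or +1
            # Turn each knob a little in the direction that reduces the error.
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

# Toy 2x2 "images", flattened to 4 pixels: label 1 if the top-left pixel is lit.
samples = [(1, 0, 0, 0), (1, 1, 0, 1), (0, 1, 1, 0), (0, 0, 1, 1)]
labels = [1, 1, 0, 0]

w, b = train(samples, labels)
preds = [1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
         for x in samples]
print(preds)  # -> [1, 1, 0, 0]: the tuned weights separate the two classes
```

The Mark 1 did the same thing physically: each of its motorized potentiometers stored one weight, and the update rule decided which way to turn it.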

Computers today don’t log their experiences---or ours---using analog parts like the Perceptron’s self-turning knobs. They store and crunch data digitally, using the 1s and 0s of binary numbers. But 11 miles away from the Computer History Museum, a Redwood City, California, startup called Mythic is trying to revive analog computing for artificial intelligence. CEO and cofounder Mike Henry says it’s necessary if we’re to get the full benefits of artificial intelligence in compact devices like phones, cameras, and hearing aids.

Mythic's analog chips are designed to run artificial neural networks in small devices. (Image: Mythic)

Mythic uses analog chips to run artificial neural networks---the deep-learning software driving the recent excitement about AI. The technique requires large volumes of mathematical and memory operations that are taxing for computers---and particularly challenging for small devices with limited processing power and battery life. That's why the most powerful AI systems reside on beefy cloud servers. It's also limiting, because some places AI could be useful have privacy, time, or energy constraints that make handing off data to a distant computer impractical.

You might say Mythic’s project is an exercise in time travel. “By the time I went to college, analog computers were gone,” says Eli Yablonovitch, a professor at the University of California, Berkeley, who got his first degree in 1967. “This brings back something that had been soundly rejected.” Analog circuits have long been relegated to certain niches, such as radio signal processing.

Henry says internal tests indicate Mythic's chips can run more powerful neural networks in a compact device than a conventional smartphone chip can. "This can help deploy deep learning to billions of devices like robots, cars, drones, and phones," he says.

Henry likes to show the difference his chips could make with a demo: simulations of his chip and of a smartphone chip marketed as tuned for AI both run software that spots pedestrians in video from a car-mounted camera. (The chips Mythic has made so far are too small to run a full video-processing system.) In the demo, Mythic’s chip can spot people from a greater distance, because it doesn’t have to scale down the video to process it. The suggestion is clear: you’ll be more comfortable sharing streets with autonomous vehicles that boast analog inside.

Digital computers work by crunching binary numbers through clockwork-like sequences of arithmetic. Analog computers operate more like a plumbing system, with electrical current in place of water. Electrons flow through a maze of components like amplifiers and resistors that do the work of mathematical operations by changing the current or combining it with others. Measuring the current that emerges from the pipeline reveals the answer.
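The plumbing analogy maps neatly onto a neural network's core operation, the multiply-accumulate. Encode each input as a voltage and each weight as a conductance: Ohm's law (I = G·V) does the multiplication in every branch, and Kirchhoff's current law sums the branch currents on a shared output wire. A minimal, idealized numeric sketch (no noise, invented values; this is not Mythic's actual circuit design):

```python
# Idealized sketch of an analog multiply-accumulate: Ohm's law multiplies,
# Kirchhoff's current law adds. All values are invented for illustration.

def analog_dot(voltages, conductances):
    # Each input voltage drives one branch; current per branch is I = G * V.
    branch_currents = [g * v for g, v in zip(conductances, voltages)]
    # Every branch feeds the same output wire, so the currents simply sum.
    return sum(branch_currents)

inputs = [0.5, 1.0, 0.25]   # volts, standing in for neuron activations
weights = [2.0, 0.5, 4.0]   # siemens, standing in for learned weights

total_current = analog_dot(inputs, weights)
print(total_current)  # 0.5*2.0 + 1.0*0.5 + 0.25*4.0 = 2.5 (amps)
```

A digital chip would perform the same dot product as a sequence of multiply and add instructions; here the physics computes it in one step, and reading the output current is the measurement the article describes.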

That approach burns less energy than an equivalent digital device on some tasks because it requires fewer circuits. A Mythic chip can also do all the work of running a neural network without having to tap a device's memory, which can interfere with other functions. The analog approach isn't great for everything, not least because it's more difficult to control noise, which can affect the precision of numbers. But that's not a problem for running neural networks, which are prized for their ability to make sense of noisy data like images or sound. "Analog math is great for neural networks, but I wouldn't balance my checkbook with it," Henry says.
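Henry's checkbook quip can be made concrete. In a neural network, only the rough magnitude (often just the sign) of a weighted sum drives the decision, so a few percent of analog noise rarely changes the outcome; in bookkeeping, every cent of the sum matters. A hedged sketch with invented numbers and simulated ~2% noise:

```python
# Why noise is tolerable for neural networks but not for bookkeeping.
# All numbers are invented; noise is simulated at roughly 2% per value.
import random

random.seed(0)

def noisy(x, pct=0.02):
    # Perturb a stored value by up to +/-2%, mimicking analog imprecision.
    return x * (1 + random.uniform(-pct, pct))

# Neural-net-style decision: only the sign of the weighted sum matters.
weights = [0.8, -0.3, 0.5]
inputs = [1.0, 2.0, 0.5]
exact = sum(w * x for w, x in zip(weights, inputs))
approx = sum(noisy(w) * x for w, x in zip(weights, inputs))
print(exact > 0, approx > 0)  # -> True True: same decision despite the noise

# Checkbook-style arithmetic: the drifted total misses the exact balance.
ledger = [125.00, -49.99, 300.10]
print(sum(ledger), sum(noisy(v) for v in ledger))
```

Here the 2% perturbation can never flip the classification (the worst-case drift is far smaller than the margin), but it visibly corrupts the ledger total, which is exactly the distinction Henry draws.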