[Image: Intel's neuromorphic chip. Credit: Intel Labs]

This week, Intel will show off a chip that learns to recognize objects in pictures captured by a webcam. Nothing fancy about that, except that the chip uses about a thousandth as much power as a conventional processor.

The device, called Loihi, which Intel is putting through its paces at the Consumer Electronics Show (CES) in Las Vegas, is a neuromorphic chip—one that mimics, in a simplified way, the functioning of neurons and synapses in the brain.

The best AI algorithms already use brain-like programs called simulated neural networks, which rely on parallel processing to recognize patterns in data—including objects in images and words in speech. Neuromorphic chips take this idea further by etching the workings of neural networks into silicon. They are less flexible and powerful than the best general-purpose chips, but being specialized to their task makes them very energy efficient, and thus ideal for mobile devices, vehicles, and industrial equipment.
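The "neurons in silicon" idea can be made concrete with a toy model. Neuromorphic chips typically implement spiking neurons, such as the classic leaky integrate-and-fire model, which stay quiet until accumulated input crosses a threshold — one reason the hardware can be so power efficient. The sketch below is purely illustrative; the threshold, decay, and input values are made-up demonstration numbers, not Intel's Loihi design.

```python
def simulate_lif(input_current, threshold=1.0, decay=0.9):
    """Simulate one leaky integrate-and-fire (LIF) neuron over time.

    Each step, the membrane potential leaks (decays), then integrates
    the incoming current. When it crosses the threshold, the neuron
    emits a spike (1) and resets to zero; otherwise it emits 0.
    All parameter values here are arbitrary, for illustration only.
    """
    potential = 0.0
    spikes = []
    for current in input_current:
        potential = potential * decay + current  # leak, then integrate
        if potential >= threshold:
            spikes.append(1)      # fire a spike
            potential = 0.0       # reset after firing
        else:
            spikes.append(0)      # stay silent
    return spikes

# Weak input must accumulate over several steps before a spike;
# strong input fires immediately. Output is sparse either way.
print(simulate_lif([0.4, 0.4, 0.4, 0.4]))
print(simulate_lif([1.2, 0.0, 1.2, 0.0]))
```

Because the neuron does work only when it spikes, activity — and hence energy use — is sparse, in contrast to a conventional processor that clocks every unit on every cycle.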

The idea of neuromorphic chips has been around for decades, but the technology may finally be ready to find its commercial niche. Across the tech industry, progress in AI has inspired new research into hardware capable of using machine-learning algorithms more efficiently.

Chris Eliasmith, a professor who studies neuroscience and computer architectures at the University of Waterloo in Canada, says the biggest challenge with neuromorphic chips in the past has been scaling them up. “This is one thing I really like about Intel entering the space,” he says. “They have the resources to push things ahead quickly.”