If you’re asked to guess the emotion of someone in a video clip, neurons in your brain will exchange information in a flurry of electrical spikes. When researchers at Intel recently put a similar challenge to the prototype of their new chip, Loihi, it tried to solve the problem with thousands of spiking silicon “neurons” of its own. Like your neurons, they can adjust the connections between themselves to adapt to new tasks.

Intel’s new design, named after a submarine volcano in Hawaii, still isn’t much like a real brain. But it’s very different from a conventional processor. The company says this approach could one day make cars, cameras, and robots smarter without having to rely on an internet connection to the cloud. Cutting the cord removes the need to wait for data to traverse the internet and has privacy benefits.

Intel says tests indicate its brain-inspired, or neuromorphic, design can do things like interpret video using as little as one-thousandth of the energy of a conventional chip. That, and Loihi’s ability to learn as it encounters new data, are seen as pointing to a future where machines can better hold their own in the always-changing real world. “We’re trying to get better at understanding things that are happening in a natural environment,” says Michael Mayberry, managing director of Intel’s research arm.

A schematic of Intel's experimental new chip design. (Image: Intel)

Loihi is still a research project. Intel has made two smaller prototypes and is now testing the full design. Mayberry says the first full version of the chip, with 130,000 neurons on a piece of silicon the size of a pinkie fingernail, will be fabricated in November, and some academic and research institutions will get to try it in 2018. He estimates that if Loihi sparks interest, it would take two years or more to get to market.

Loihi is Intel’s latest effort to turn the current vogue for AI into a new growth engine for the company. Last year Intel acquired two startups working on chips to power machine learning in the cloud and for computer vision, Nervana and Movidius. This March, Intel bought Israel’s Mobileye, which makes cameras and chips for automated driving.

Intel needs new growth businesses. The PC market it has long dominated is stalling, and the company has given up on breaking into the market for processors inside mobile devices.

The AI startups Intel acquired created chips to accelerate artificial neural networks. The technique underpins advances such as Google’s Go champion bot AlphaGo, and involves loosely simulating neurons that work together to filter data. But Intel’s existing technology—and that in AI chips from Google, Microsoft, and Apple—powers neural networks using conventional chip designs.
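To make “loosely simulating neurons that work together to filter data” concrete, here is a minimal sketch of a conventional artificial neural network layer of the kind those chips accelerate: each “neuron” computes a weighted sum of its inputs and applies a nonlinearity. The weights below are invented for illustration; a real network learns them from data.

```python
def relu(x):
    """Standard nonlinearity: pass positive values, zero out negatives."""
    return max(0.0, x)

def dense_layer(inputs, weights, biases):
    """One fully connected layer: a weighted sum per neuron, then ReLU."""
    return [
        relu(sum(w * x for w, x in zip(neuron_weights, inputs)) + b)
        for neuron_weights, b in zip(weights, biases)
    ]

# A toy two-layer network mapping 3 input features to a single score.
hidden = dense_layer([0.2, 0.7, 0.1],
                     weights=[[0.5, -0.3, 0.8], [0.1, 0.9, -0.4]],
                     biases=[0.0, 0.1])
output = dense_layer(hidden, weights=[[1.0, 0.5]], biases=[0.2])
print(output)
```

On conventional hardware, every one of these multiply-accumulate steps shuttles weights and activations between separate memory and processor, which is exactly the cost Loihi’s design tries to avoid.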

Intel’s Loihi is different because its crude analogs of neurons are burned into hardware, and its design differs fundamentally from the computer chips the world runs on today. In conventional chips, data shuttles back and forth between a processor and separate memory. Loihi’s “neurons” and the adjustable connections between them function as both processor and memory, saving the time and energy required to shuffle data around. The connections between neurons, analogous to synapses, can adjust to patterns in their activity over time, mimicking a learning mechanism seen in real brains. Tests of this ability have included showing the chip videos of people performing movements such as bicep curls, and challenging it to recognize the same motion in fresh video clips.
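The two ideas in that paragraph can be sketched in a few lines: a leaky integrate-and-fire “neuron” that emits spikes, and a synapse whose weight adjusts based on spike timing (spike-timing-dependent plasticity, or STDP). This is a generic textbook model of spiking neurons, not Intel’s actual circuit or learning rule.

```python
import math

class LIFNeuron:
    """Leaky integrate-and-fire neuron: integrates input current,
    leaks charge each step, and spikes when it crosses a threshold."""
    def __init__(self, threshold=1.0, leak=0.9):
        self.v = 0.0              # membrane potential
        self.threshold = threshold
        self.leak = leak          # fraction of potential retained per step

    def step(self, input_current):
        self.v = self.v * self.leak + input_current
        if self.v >= self.threshold:
            self.v = 0.0          # reset after spiking
            return True
        return False

def stdp_update(weight, dt, lr=0.05, tau=20.0):
    """Toy STDP rule: strengthen the synapse when the presynaptic spike
    precedes the postsynaptic one (dt > 0), weaken it otherwise."""
    if dt > 0:
        weight += lr * math.exp(-dt / tau)
    else:
        weight -= lr * math.exp(dt / tau)
    return max(0.0, min(1.0, weight))

# Drive the neuron through one adaptive synapse; the weight is both
# the "memory" and part of the computation, with no separate data bus.
neuron = LIFNeuron()
weight = 0.5
spike_times = []
for t in range(20):
    # the presynaptic neuron fires every step, delivering weight * 0.6 current
    if neuron.step(weight * 0.6):
        spike_times.append(t)
        # pre fired just before post here, so the synapse strengthens
        weight = stdp_update(weight, dt=1.0)

print(f"spike times: {spike_times}, final weight: {weight:.3f}")
```

Because learning is local to each synapse, a chip built this way can keep adapting to new data on-device, which is the capability the bicep-curl tests probe.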

Intel is not the first company to design a chip using pointers from neuroscience. IBM built two generations of its own neuromorphic processor, although that chip, unlike Intel’s, cannot learn from incoming data. IBM’s chip began life under a grant from the Pentagon research agency DARPA, which hopes neuromorphic hardware could do things like automatically analyze drone video footage on the battlefield. IBM has struck deals with two labs to build research systems with its chip, but has not announced broad commercial availability.

Some leading AI researchers, including Facebook’s Yann LeCun, have expressed skepticism about neuromorphic chips, noting that spiking silicon neurons have not yet proved as powerful or flexible as machine-learning software running on conventional chips.

Intel’s Mayberry says the way Loihi learns will make it more adaptable than previous systems. His project may also benefit from favorable timing.

The recent rush to make and use chips designed to support AI software suggests companies are no longer happy to just rely on improvements to conventional chip technology. “All these companies are realizing it makes sense to do specialized designs,” says Tushar Krishna, an assistant professor at Georgia Tech. He points to the demise of Moore’s Law, the once-reliable doubling of transistor density that delivered steady gains in processor performance, as another reason companies are becoming more open-minded about new ideas. Brain-inspired chips remain far from proven, but may be poised to get a more serious look.