Intel Corp. is releasing an experimental research system for neuromorphic computing, a cutting-edge method that simulates the way human brains work to perform computations faster and with significantly less energy.

The system, called Pohoiki Springs, will be made available this month over the cloud to members of the Intel Neuromorphic Research Community, which includes academic researchers, government labs and about a dozen companies such as Accenture PLC and Airbus SE.

Others, including International Business Machines Corp., are also researching the technique.

Neuromorphic chips are expected to be the predominant computing architecture for new, advanced forms of artificial-intelligence deployments by 2025, according to technology research firm Gartner Inc. By then, the firm predicts, the technology will displace graphics processing units as one of the main computer chips used for AI systems, especially neural networks, which are used in speech recognition and understanding as well as computer vision.

Intel’s neuromorphic research chips. Photo: Tim Herman / Intel Corp.

With neuromorphic computing, it is possible to train machine-learning models using a fraction of the data required on traditional computing hardware. The models learn much the way human babies do, by seeing an image or toy once and being able to recognize it from then on, said Mike Davies, director of Intel’s Neuromorphic Computing Lab.

The models can also learn from the data nearly instantaneously, ultimately making predictions that could be more accurate than those made by traditional machine-learning models, he said. “It’s going to make some computations [possible] that are intractable today” because they require too much energy or too much time to calculate, Mr. Davies said.

In the case of a widespread power outage, for example, neuromorphic computing could automatically help identify certain areas where power will be needed most, Mr. Davies said. It could also help consumers more accurately find items that are similar to or match photos of specific products, he said.

In the Pohoiki Springs system, unlike in traditional machines, the memory and computing elements are intertwined rather than separate, Mr. Davies said. That minimizes the distance data has to travel; in traditional computing architectures, data has to flow back and forth between memory and processor, he said.
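To make that architecture concrete, here is a minimal sketch, in plain Python rather than anything resembling Intel’s chip, of the leaky integrate-and-fire spiking neurons that neuromorphic hardware implements in silicon. The class and parameter names are illustrative; the point to notice is that each neuron’s state and its synaptic weights live together, so an update touches only local data.

```python
import numpy as np

# A minimal leaky integrate-and-fire (LIF) neuron layer, the kind of
# spiking model neuromorphic chips realize in silicon. Each neuron keeps
# its own state (membrane voltage) and synapses together, so one timestep
# of computation reads and writes only local data.
class LIFLayer:
    def __init__(self, n_inputs, n_neurons, decay=0.9, threshold=1.0):
        self.weights = np.random.randn(n_inputs, n_neurons) * 0.1  # synapses, stored with the neuron
        self.voltage = np.zeros(n_neurons)   # membrane state, also local
        self.decay = decay                   # leak factor per timestep
        self.threshold = threshold           # fire when voltage crosses this

    def step(self, input_spikes):
        # Integrate incoming spikes, leak a little, fire where over threshold.
        self.voltage = self.decay * self.voltage + input_spikes @ self.weights
        spikes = self.voltage >= self.threshold
        self.voltage[spikes] = 0.0           # reset neurons that fired
        return spikes.astype(float)

layer = LIFLayer(n_inputs=64, n_neurons=16)
for _ in range(10):
    out = layer.step(np.random.binomial(1, 0.05, size=64).astype(float))
```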

Intel researchers recently used a single neuromorphic research chip to train an AI system to recognize hazardous odors from one training sample per odor, compared with the roughly 3,000 samples required by state-of-the-art deep-learning methods, and using a fraction of the energy.

In the experiment, the machine-learning model was able to detect odors such as ammonia, acetone and methane from a chemical sensor’s readings, even when they were masked by other scents. Such odors can suggest the presence of explosives and narcotics.
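Intel’s odor experiment runs on spiking networks, but the one-sample-per-class idea it relies on can be sketched with an ordinary nearest-prototype classifier; the sensor readings and labels below are made up for illustration.

```python
import numpy as np

# One-shot classification in the style the article describes: store a single
# "fingerprint" per odor and label new readings by the nearest stored example.
# (Intel's system does this with spiking networks; this plain nearest-prototype
# sketch only illustrates the one-sample-per-class idea.)
class OneShotClassifier:
    def __init__(self):
        self.prototypes = {}  # odor name -> stored sensor reading

    def learn(self, label, reading):
        # One training sample per class: just remember it.
        self.prototypes[label] = np.asarray(reading, dtype=float)

    def predict(self, reading):
        reading = np.asarray(reading, dtype=float)
        # Return the label whose stored fingerprint is closest.
        return min(self.prototypes,
                   key=lambda label: np.linalg.norm(self.prototypes[label] - reading))

clf = OneShotClassifier()
clf.learn("ammonia", [0.9, 0.1, 0.3, 0.7])    # hypothetical 4-channel sensor readings
clf.learn("acetone", [0.2, 0.8, 0.6, 0.1])
print(clf.predict([0.85, 0.15, 0.35, 0.65]))  # -> "ammonia"
```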

The Pohoiki Springs system comprises about 770 such neuromorphic research chips, roughly 130,000 digital neurons apiece, inside a chassis the size of five standard servers. It has a computational capacity of about 100 million neurons, roughly that of a mole rat’s brain, Mr. Davies said.

One of the major benefits of neuromorphic computing is the potential for it to perform AI-based calculations using much less energy, said Edy Liongosari, chief research scientist at Accenture Labs, the consulting company’s technology research and development division.

Energy consumption is an impediment to large-scale AI deployments. Developing a single AI model, for example, can have a carbon footprint equivalent to the lifetime emissions of five average U.S. cars, according to researchers at the University of Massachusetts, Amherst.

Accenture Labs has been working with Intel’s neuromorphic computing researchers since 2018 to see how the technology could benefit AI algorithms that are used in internet-connected devices, such as security cameras that are constantly detecting motion. Neuromorphic chips could eventually be embedded in cameras.

“Power is such a precious commodity in some of these use cases,” Mr. Liongosari said.

Such cameras constantly analyze massive amounts of data, and therefore consume energy, to identify anomalies such as intrusions, he said. Neuromorphic computing could help machine-learning algorithms recognize an intrusion using far less training data.
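As a rough illustration of why event-driven processing saves power, the following sketch, which is not Intel’s or Accenture’s code, wakes a detector only when enough pixels change between frames; the thresholds and frame sizes are arbitrary.

```python
import numpy as np

# Event-driven processing of the kind neuromorphic vision aims at: instead of
# running a full model on every frame, compute only where pixels changed.
def changed_pixels(prev_frame, frame, threshold=0.1):
    # Emit "events" only where brightness moved by more than the threshold.
    return np.abs(frame - prev_frame) > threshold

rng = np.random.default_rng(0)
prev = rng.random((120, 160))       # previous camera frame
curr = prev.copy()
curr[40:60, 70:90] += 0.5           # a simulated intruder enters the scene

events = changed_pixels(prev, curr)
if events.mean() > 0.01:            # analyze further only if enough activity
    print(f"motion in {events.sum()} pixels, run the detector")
else:
    print("scene static, stay in low-power mode")
```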

Write to Sara Castellanos at sara.castellanos@wsj.com