AMHERST, Mass. – Opening the way for advances in applications such as natural language understanding, machine translation, speech recognition and video surveillance, a team of researchers headed by Qiangfei Xia and Joshua Yang, electrical and computer engineers at the University of Massachusetts Amherst, says it can use memristor crossbar arrays to overcome a key bottleneck in traditional computing architecture. The findings are published in Nature Machine Intelligence.

The researchers say a memristor is a two-terminal “memory resistor” that performs computation at the same location where information is stored. This feature removes the need, inherent in traditional computers, to shuttle data between separate processing and memory units, yielding much better energy and speed efficiency. A memristor crossbar is a matrix of these tiny switches.
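In a crossbar, the conductance stored in each memristor acts as a matrix weight: applying input voltages across the array produces output currents that, by Ohm's law and Kirchhoff's current law, equal a matrix-vector product computed in a single analog step. A minimal NumPy sketch of that computation (the conductance and voltage values below are illustrative placeholders, not figures from the paper):

```python
import numpy as np

# Conductance of each memristor in the crossbar (siemens).
# These are the stored "weights"; values are illustrative only.
G = np.array([[1.0e-4, 2.0e-4, 0.5e-4],
              [3.0e-4, 1.5e-4, 2.5e-4]])

# Input voltages applied to the crossbar lines (volts).
V = np.array([0.2, 0.1, 0.3])

# Each output current is the sum of V_j * G_ij along a line
# (Kirchhoff's current law), so the whole array performs a
# matrix-vector multiplication in one step, in place.
I = G @ V
print(I)
```

The point of the analogy is that the multiplication happens where the weights are stored, which is what lets the crossbar sidestep the memory-to-processor data transfers of a conventional machine.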

“Recent breakthroughs in recurrent deep neural networks with long short-term memory (LSTM) units have led to major advances in artificial intelligence,” the researchers say. However, state-of-the-art LSTM models, with their significantly increased complexity and large number of parameters, face a computing bottleneck that stems from both limited memory capacity and limited data-communication bandwidth.
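For readers unfamiliar with LSTM units: each step of a standard LSTM cell combines the current input with the previous hidden and cell states through four learned gates. The sketch below shows those standard equations in NumPy; it is a generic textbook formulation, not the paper's implementation, and in the memristor version the weight matrices are the values that would be programmed into the crossbar as conductances.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, U, b):
    """One step of a standard LSTM cell.

    W, U, b hold the stacked parameters for the four gates
    (input, forget, candidate, output). The W @ x + U @ h_prev
    products are exactly the matrix-vector operations a memristor
    crossbar can perform in place.
    """
    n = h_prev.size
    z = W @ x + U @ h_prev + b       # all four gate pre-activations
    i = sigmoid(z[0:n])              # input gate
    f = sigmoid(z[n:2*n])            # forget gate
    g = np.tanh(z[2*n:3*n])          # candidate cell state
    o = sigmoid(z[3*n:4*n])          # output gate
    c = f * c_prev + i * g           # updated cell state
    h = o * np.tanh(c)               # updated hidden state
    return h, c

# Tiny usage example with random parameters (illustrative only).
rng = np.random.default_rng(0)
nx, nh = 3, 4
W = rng.standard_normal((4 * nh, nx))
U = rng.standard_normal((4 * nh, nh))
b = np.zeros(4 * nh)
h, c = lstm_step(rng.standard_normal(nx), np.zeros(nh), np.zeros(nh), W, U, b)
print(h.shape, c.shape)
```

The matrix multiplications dominate the cost of each step, which is why the number of parameters creates the memory and communication bottleneck the researchers describe.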

A solution to this LSTM bottleneck can be implemented with a memristor crossbar array, which has a small circuit footprint, can store a large number of parameters, and offers in-memory computing capability that helps circumvent what is known as the von Neumann bottleneck. “We illustrate the capability of our crossbar system as a core component in solving real-world problems,” the authors write, “and show that memristor LSTM is a promising low-power and low-latency hardware platform for edge inference.”

The von Neumann bottleneck refers to the limits on data-transfer rate and energy efficiency in a computer built on the von Neumann architecture, in which the processing and memory units are physically separate and connected by a single shared bus. John von Neumann was a 20th-century mathematician, scientist and computing pioneer who in 1945 proposed the architecture that remains the basis for digital computers today.

“The memristor crossbar implementation of an LSTM,” say the authors, “to the best of our knowledge, has yet to be demonstrated, primarily because of the relative scarcity of large memristor arrays. In this work, we demonstrate our experimental implementation of a core part of LSTM networks in memristor crossbar arrays.”

As a demonstration, the authors applied the memristor-based LSTM to predicting the number of airline passengers based on data from past years, and to identifying a person by his or her gait. The latter matters when a face is obscured or facial recognition is otherwise impractical. “This work shows that the LSTM networks built in memristor crossbar arrays represent a promising alternative computing paradigm with high-speed energy efficiency,” the authors conclude.

The Nature Machine Intelligence paper represents a collaboration among 17 researchers at UMass Amherst, Hewlett Packard Labs in Palo Alto, Calif., and the Air Force Research Laboratory Information Directorate in Rome, N.Y.