
Significant energy savings using neuromorphic hardware

Date:
May 24, 2022
Source:
Graz University of Technology
Summary:
New research demonstrates that neuromorphic technology is up to sixteen times more energy-efficient for large deep learning networks than other AI systems.

For the first time, TU Graz's Institute of Theoretical Computer Science and Intel Labs have demonstrated experimentally that a large neural network can process sequences such as sentences while consuming four to sixteen times less energy on neuromorphic hardware than on non-neuromorphic hardware. The new research is based on Intel Labs' Loihi neuromorphic research chip, which draws on insights from neuroscience to create chips that function similarly to the biological brain.

The research was funded by the Human Brain Project (HBP), one of the largest research projects in the world, with more than 500 scientists and engineers across Europe studying the human brain. The results are published in the paper "A Long Short-Term Memory for AI Applications in Spike-based Neuromorphic Hardware" (DOI: 10.1038/s42256-022-00480-w) in Nature Machine Intelligence.

Human brain as a role model

Smart machines and intelligent computers that can autonomously recognize and infer objects and relationships between different objects are the subjects of worldwide artificial intelligence (AI) research. Energy consumption is a major obstacle on the path to a broader application of such AI methods. It is hoped that neuromorphic technology will provide a push in the right direction. Neuromorphic technology is modelled after the human brain, which is highly efficient in using energy. To process information, its hundred billion neurons consume only about 20 watts, not much more energy than an average energy-saving light bulb.
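As a back-of-the-envelope figure, 20 watts divided among some 100 billion neurons comes to only about 0.2 nanowatts per neuron on average, which illustrates the efficiency target that brain-inspired hardware designs are aiming at.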

In the research, the group focused on algorithms that work with temporal processes. For example, the system had to answer questions about a previously told story and grasp the relationships between objects or people from the context. The hardware tested consisted of 32 Loihi chips.
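To make the task type concrete, here is a hypothetical example of such story-based question answering; the sentences and the answer are invented for illustration and are not taken from the study:

```python
# Hypothetical example of the story-based question answering described
# above. The story, question, and answer are invented for illustration;
# they are not taken from the study's actual dataset.
story = [
    "Mary picked up the ball.",
    "Mary went to the garden.",
    "John went to the kitchen.",
]
question = "Where is the ball?"

# Answering requires relating objects and people across time:
# ball -> Mary (sentence 1), Mary -> garden (sentence 2).
answer = "garden"
```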

Loihi research chip: up to sixteen times more energy-efficient than non-neuromorphic hardware

"Our system is four to sixteen times more energy-efficient than other AI models on conventional hardware," says Philipp Plank, a doctoral student at TU Graz's Institute of Theoretical Computer Science. Plank expects further efficiency gains as these models are migrated to the next generation of Loihi hardware, which significantly improves the performance of chip-to-chip communication.

"Intel's Loihi research chips promise to bring gains in AI, especially by lowering their high energy cost," said Mike Davies, director of Intel's Neuromorphic Computing Lab. "Our work with TU Graz provides more evidence that neuromorphic technology can improve the energy efficiency of today's deep learning workloads by re-thinking their implementation from the perspective of biology."

Mimicking human short-term memory

In their neuromorphic network, the group reproduced a presumed memory mechanism of the brain, as Wolfgang Maass, Philipp Plank's doctoral supervisor at the Institute of Theoretical Computer Science, explains: "Experimental studies have shown that the human brain can store information for a short period of time even without neural activity, namely in so-called 'internal variables' of neurons. Simulations suggest that a fatigue mechanism of a subset of neurons is essential for this short-term memory."

Direct proof is lacking because these internal variables cannot yet be measured, but the implication is that the network only needs to test which neurons are currently fatigued in order to reconstruct what information it has previously processed. In other words, previous information is stored in the non-activity of neurons, and non-activity consumes the least energy.
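A minimal sketch of how such a fatigue mechanism can be modelled in software, assuming a simple discrete-time spiking neuron whose firing threshold rises with each spike and then decays slowly; the model and parameter values are illustrative assumptions, not the study's Loihi implementation:

```python
import numpy as np

# Minimal sketch of an adaptive-threshold ("fatiguing") spiking neuron.
# After each spike the firing threshold is raised and then decays slowly,
# so the elevated threshold acts as a hidden internal variable that
# remembers recent activity without further spiking. All parameter values
# are illustrative assumptions.
T = 200                       # simulation steps
decay = np.exp(-1.0 / 100.0)  # slow per-step decay of the fatigue variable
alpha = 0.9                   # membrane leak per step
b0, beta = 1.0, 0.5           # baseline threshold, adaptation strength

v, a = 0.0, 0.0               # membrane potential, fatigue ("internal variable")
spikes = []

for t in range(T):
    inp = 0.3 if t < 50 else 0.0   # drive the neuron only at the start
    v = alpha * v + inp            # leaky integration of the input
    a = decay * a                  # fatigue decays slowly between spikes
    if v > b0 + beta * a:          # effective threshold rises with fatigue
        spikes.append(t)
        a += 1.0                   # each spike deepens the fatigue
        v = 0.0                    # reset the membrane after a spike

# Long after the input has ended, a > 0 still encodes the fact that this
# neuron was recently active: information held without any spiking.
print(f"spike times: {spikes}, residual fatigue: {a:.3f}")
```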

Symbiosis of recurrent and feed-forward networks

For this purpose, the researchers linked two types of deep learning networks. Feedback (recurrent) neural networks are responsible for the "short-term memory": many such recurrent modules filter possible relevant information out of the input signal and store it. A feed-forward network then determines which of the relationships found are important for solving the task at hand. Meaningless relationships are screened out, and neurons fire only in those modules where relevant information has been found. This process ultimately leads to energy savings.
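A rough structural sketch of this division of labour, written here as a conventional (non-spiking) network for readability; the module count, layer sizes, and gating scheme are assumptions for illustration, not the architecture reported in the paper:

```python
import torch
import torch.nn as nn

# Rough structural sketch of the described division of labour, written as
# a conventional (non-spiking) network for readability. Module count,
# sizes, and the gating scheme are illustrative assumptions only.
class RecurrentThenFeedForward(nn.Module):
    def __init__(self, n_in=32, n_hidden=64, n_modules=4, n_out=10):
        super().__init__()
        # Several recurrent modules each filter and store candidate
        # relations from the input sequence ("short-term memory").
        self.memory_modules = nn.ModuleList(
            [nn.GRU(n_in, n_hidden, batch_first=True) for _ in range(n_modules)]
        )
        # A feed-forward network then scores which stored relations
        # actually matter for the task; irrelevant ones are gated out.
        self.relevance = nn.Linear(n_hidden * n_modules, n_modules)
        self.readout = nn.Linear(n_hidden * n_modules, n_out)

    def forward(self, x):                        # x: (batch, time, n_in)
        states = [rnn(x)[1].squeeze(0) for rnn in self.memory_modules]
        h = torch.cat(states, dim=-1)            # all stored candidate relations
        gate = torch.sigmoid(self.relevance(h))  # per-module relevance in [0, 1]
        h = h.view(h.size(0), len(states), -1) * gate.unsqueeze(-1)
        return self.readout(h.flatten(1))        # answer from the gated memory

out = RecurrentThenFeedForward()(torch.randn(2, 20, 32))
print(out.shape)  # torch.Size([2, 10])
```

Here the sigmoid gate plays the role of the feed-forward relevance filter: modules whose gate is near zero contribute nothing to the answer, the software analogue of neurons in irrelevant modules staying silent.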

"Recurrent neural structures are expected to provide the greatest gains for applications running on neuromorphic hardware in the future," said Davies. "Neuromorphic hardware like Loihi is uniquely suited to facilitate the fast, sparse and unpredictable patterns of network activity that we observe in the brain and need for the most energy efficient AI applications."

This research was financially supported by Intel and the European Human Brain Project, which connects neuroscience, medicine, and brain-inspired technologies in the EU. For this purpose, the project is creating a permanent digital research infrastructure, EBRAINS. This research work is anchored in the Fields of Expertise "Human & Biotechnology" and "Information, Communication & Computing," two of the five Fields of Expertise of TU Graz.


Story Source:

Materials provided by Graz University of Technology. Original written by Christoph Pelzl. Note: Content may be edited for style and length.


Journal Reference:

  1. Arjun Rao, Philipp Plank, Andreas Wild, Wolfgang Maass. A Long Short-Term Memory for AI Applications in Spike-based Neuromorphic Hardware. Nature Machine Intelligence, 2022; DOI: 10.1038/s42256-022-00480-w
