
Making big data processing more energy efficient using magnetic circuits

Date:
April 13, 2020
Source:
University of Texas at Austin
Summary:
New research finds that magnetic nanowires, spaced at certain distances, can reduce the energy needed to run neural network training algorithms by a factor of 20 to 30.

The rapid progression of technology has led to a huge increase in energy usage to process the massive troves of data generated by devices. But researchers in the Cockrell School of Engineering at The University of Texas at Austin have found a way to make the new generation of smart computers more energy efficient.

Traditionally, silicon chips have formed the building blocks of the infrastructure that powers computers. This research uses magnetic components instead of silicon, revealing how the physics of those magnetic components can cut the energy costs and requirements of training algorithms -- neural networks that can learn to recognize images and patterns.

"Right now, the methods for training your neural networks are very energy-intensive," said Jean Anne Incorvia, an assistant professor in the Cockrell School's Department of Electrical and Computer Engineering. "What our work can do is help reduce the training effort and energy costs."

The researchers' findings were published this week in the IOP journal Nanotechnology. Incorvia led the study with first author and second-year graduate student Can Cui. Incorvia and Cui discovered that spacing magnetic nanowires, which act as artificial neurons, in certain ways naturally increases the artificial neurons' ability to compete against each other, with the most activated ones winning out. Achieving this effect, known as "lateral inhibition," traditionally requires extra circuitry within computers, which increases costs and takes more energy and space.

Incorvia said their method uses 20 to 30 times less energy than a standard back-propagation algorithm performing the same learning tasks.

Just as human brains contain neurons, new-era computers have artificial versions of these integral nerve cells. Lateral inhibition occurs when the neurons firing the fastest are able to prevent slower neurons from firing. In computing, this cuts down on the energy used to process data.
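The lateral-inhibition idea described above can be sketched in software as a winner-take-all competition: neurons integrate their inputs, and the first one to cross a firing threshold suppresses its neighbors. This is a minimal illustrative sketch of the general concept, not the authors' magnetic-nanowire implementation; the function name, threshold, and inhibition strength are hypothetical.

```python
# Winner-take-all sketch of lateral inhibition among integrating neurons.
# All parameter values here are illustrative, not from the study.

def winner_take_all(inputs, threshold=1.0, inhibition=0.5, steps=100):
    """Integrate each neuron's input drive until one crosses the firing
    threshold; the winner then inhibits (lowers) the others' potentials."""
    potentials = [0.0] * len(inputs)
    for _ in range(steps):
        # Each neuron accumulates charge in proportion to its input drive.
        for i, drive in enumerate(inputs):
            potentials[i] += drive
        winner = max(range(len(inputs)), key=lambda i: potentials[i])
        if potentials[winner] >= threshold:
            # Lateral inhibition: the fastest-firing neuron suppresses the rest.
            for i in range(len(potentials)):
                if i != winner:
                    potentials[i] -= inhibition
            return winner
    return None  # no neuron fired within the step budget

# The most strongly driven neuron wins the competition.
print(winner_take_all([0.2, 0.5, 0.1]))  # → 1
```

In the paper, this competitive suppression emerges from the magnetic interaction between closely spaced nanowires rather than from extra circuitry, which is where the energy savings come from.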

Incorvia explains that the way computers operate is fundamentally changing. A major trend is the concept of neuromorphic computing, which is essentially designing computers to think like human brains. Instead of processing tasks one at a time, these smarter devices are meant to analyze huge amounts of data simultaneously. These innovations have powered the revolution in machine learning and artificial intelligence that has dominated the technology landscape in recent years.

This research focused on interactions between two magnetic neurons, along with initial results on interactions among multiple neurons. The next steps are to apply the findings to larger sets of neurons and to verify the results experimentally.

The research was funded by a National Science Foundation CAREER Award and Sandia National Laboratories, with resources from UT's Texas Advanced Computing Center.


Story Source:

Materials provided by University of Texas at Austin. Note: Content may be edited for style and length.


Journal Reference:

  1. Can Cui, Otitoaleke Gideon Akinola, Naimul Hassan, Christopher Bennett, Matthew Marinella, Joseph Friedman, Jean Anne Currivan Incorvia. Maximized Lateral Inhibition in Paired Magnetic Domain Wall Racetracks for Neuromorphic Computing. Nanotechnology, 2020; DOI: 10.1088/1361-6528/ab86e8

Cite This Page:

University of Texas at Austin. "Making big data processing more energy efficient using magnetic circuits." ScienceDaily. ScienceDaily, 13 April 2020. <www.sciencedaily.com/releases/2020/04/200413132812.htm>.
