
Silicon 'neurons' may add a new dimension to computer processors

Energy constraints lead to a novel form of efficient, at-a-distance communication

Date: June 4, 2020
Source: Washington University in St. Louis
Summary: Research shows that energy constraints on a system, coupled with an intrinsic property of systems, push silicon neurons to create a dynamic, at-a-distance communication that is more robust and efficient than traditional computer processors. And it may teach us something about biological brains.

When it fires, a neuron consumes significantly more energy than an equivalent computer operation. And yet, a network of coupled neurons can continuously learn, sense and perform complex tasks at energy levels that are currently unattainable for even state-of-the-art processors.

What does a neuron do to save energy that a contemporary computer processing unit doesn't?

Computer modelling by researchers at Washington University in St. Louis' McKelvey School of Engineering may provide an answer. Using simulated silicon "neurons," they found that energy constraints on a system, coupled with neurons' intrinsic tendency to settle into their lowest-energy configuration, lead to a dynamic, at-a-distance communication protocol that is both more robust and more energy-efficient than traditional computer processors.

The research, from the lab of Shantanu Chakrabartty, the Clifford W. Murphy Professor in the Preston M. Green Department of Electrical & Systems Engineering, was published last month in the journal Frontiers in Neuroscience.

It's a case of doing more with less.

Ahana Gangopadhyay, a doctoral student in Chakrabartty's lab and a lead author on the paper, has been using computer models to study the energy constraints on silicon neurons -- artificially created neurons, connected by wires, that show the same dynamics and behavior as the neurons in our brains.

Like biological neurons, their silicon counterparts also depend on specific electrical conditions to fire, or spike. These spikes are the basis of neuronal communication, zipping back and forth, carrying information from neuron to neuron.

The researchers first looked at the energy constraints on a single neuron. Then a pair. Then, they added more. "We found there's a way to couple them where you can use some of these energy constraints, themselves, to create a virtual communication channel," Chakrabartty said.

A group of neurons operates under a common energy constraint. So, when a single neuron spikes, it necessarily affects the available energy -- not just for the neurons it's directly connected to, but for all others operating under the same energy constraint.

Spiking neurons thus create perturbations in the system, allowing each neuron to "know" which others are spiking, which are responding, and so on. It's as if the neurons were all embedded in a rubber sheet; a single ripple, caused by a spike, would affect them all. And like other physical systems, networks of silicon neurons tend to self-optimize toward their least-energetic states while also being affected by the other neurons in the network.

These constraints come together to form a kind of secondary communication network, where additional information can be communicated through the dynamic but synchronized topology of spikes. It's like the rubber sheet vibrating in a synchronized rhythm in response to multiple spikes.
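The mechanism described above can be sketched in a few lines of code. The following is a toy illustration only, not the Growth Transform model from the paper: the neurons here have no synapses between them at all, yet because every spike drains a common energy pool and every firing threshold depends on that pool, each neuron's activity is felt by all the others at a distance. All names and parameter values (`simulate`, `spike_cost`, the recovery rate of 0.05) are invented for the sketch.

```python
import numpy as np

def simulate(spike_cost, T=400, N=4, seed=0):
    """Integrate-and-fire neurons coupled only through a shared energy pool.

    Toy sketch of the idea in the text, NOT the paper's Growth Transform
    model: no neuron is wired to any other, but every spike drains a
    common pool, and every threshold rises as the pool drains.
    """
    rng = np.random.default_rng(seed)
    drive = rng.uniform(0.02, 0.05, N)     # constant input per neuron
    v = np.zeros(N)                        # membrane potentials
    energy = 1.0                           # shared pool, clipped to [0, 1]
    spikes = np.zeros((T, N), dtype=bool)
    for t in range(T):
        thresh = 1.0 / max(energy, 1e-6)   # drained pool -> higher threshold
        v += drive
        fired = v >= thresh
        spikes[t] = fired
        v[fired] = 0.0                     # reset after a spike
        energy = float(np.clip(energy - spike_cost * fired.sum() + 0.05,
                               0.0, 1.0))  # drain per spike, slow recovery
    return spikes

coupled = simulate(spike_cost=0.2)      # spiking drains the shared pool
uncoupled = simulate(spike_cost=0.0)    # free spikes: constraint inactive
print("spike counts with the shared constraint:   ", coupled.sum(axis=0))
print("spike counts without the shared constraint:", uncoupled.sum(axis=0))
```

Comparing the two runs shows the "rubber sheet" at work: with a nonzero spike cost, any neuron's spike briefly raises every other neuron's threshold, delaying their spikes, even though no wire connects them.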

This topology carries with it information that is communicated, not just to the neurons that are physically connected, but to all neurons under the same energy constraint, including ones that are not physically connected.

Under the pressure of these constraints, Chakrabartty said, "They learn to form a network on the fly."

This makes for much more efficient communication than traditional computer processors, which lose most of their energy in the process of linear communication, where neuron A must first send a signal through B in order to communicate with C.
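A back-of-the-envelope way to see that difference (purely illustrative accounting, not a hardware measurement, with function names invented here): in a linear relay every intermediate neuron must retransmit, so reaching the n-th neuron in a chain costs n-1 transmissions, while a single perturbation of the shared energy pool reaches every neuron under the constraint at once.

```python
def relay_cost(n):
    """Transmissions needed for neuron A to reach the n-th neuron
    when every message must hop along the chain A -> B -> C -> ..."""
    return max(n - 1, 0)

def shared_pool_cost(n):
    """One spike perturbs the common energy pool, and every neuron
    under the constraint senses the perturbation at once."""
    return 1 if n > 1 else 0

for n in (2, 10, 100):
    print(f"{n:>3} neurons: relay={relay_cost(n):>3}, shared pool={shared_pool_cost(n)}")
```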

Using these silicon neurons in computer processors offers the best tradeoff between energy efficiency and processing speed, Chakrabartty said. It would allow hardware designers to create systems that take advantage of this secondary network, computing not only along the physical connections but also performing additional computation on the secondary network of spikes.

The immediate next steps, however, are to create a simulator that can emulate billions of neurons. Then researchers will begin the process of building a physical chip.


Story Source:

Materials provided by Washington University in St. Louis. Original written by Brandie Jefferson. Note: Content may be edited for style and length.


Journal Reference:

  1. Ahana Gangopadhyay, Darshit Mehta, Shantanu Chakrabartty. A Spiking Neuron and Population Model Based on the Growth Transform Dynamical System. Frontiers in Neuroscience, 2020; 14 DOI: 10.3389/fnins.2020.00425

