Research grasps how brain plans gripping motion
Could apply to brain-computer interfaces for neural prosthetics
- Date: July 28, 2015
- Source: Brown University
- Summary: A new study significantly advances neuroscientists' understanding of how a region of the brain formulates plans for the hand to grip an object. The findings could have direct application in improving brain-computer interface control of robotic arms and hands.
With the results of a new study, neuroscientists have a firmer grasp on the way the brain formulates commands for the hand to grip an object. The advance could lead to improvements in future brain-computer interfaces that provide people with severe paralysis a means to control robotic arms and hands using their thoughts.
The key finding of a research team based at Brown University is that neurons in the area of the brain responsible for planning grasping motions retain information about the object to be gripped as they formulate the movement plan. The collective neural activity therefore looks different when executing the same grip on one object versus another. This may help the brain generate distinct motor plans when similar actions are performed in different contexts.
For designers of brain-computer interfaces, whose goal is to translate neural patterns into commands for a prosthetic device, it may be important to know that the emerging plan to execute a "power grip," for example, may look different when the object is a hammer versus a soda can. Because the information is distributed across many neurons in a local network, the brain can maintain many of these specialized object-action representations at once.
"Many groups have looked at encoding of different grips and different hand positions," said lead and corresponding author Carlos Vargas-Irwin, an investigator in the lab of senior author John Donoghue, the Henry Merritt Wriston Professor of Neuroscience and Engineering. "Typically what's studied is the relationship between a single object and a grip associated with it. What had not been done before is to investigate how the brain can formulate different grips on the same object or the same grip on different objects."
When his team did that, they found that the brain has many ways to formulate a grip command, and those seem to be influenced by what it's gripping.
"You can have the same movement resulting from very different activity patterns within the context of different objects," Vargas-Irwin said. "If we are trying to build a [brain-computer interface] decoder we need to take into account the bigger context of what the target of the movement is."
Probing for patterns
The research team made its findings by recording and analyzing neural activity in the ventral premotor cortex of three trained rhesus macaques as they performed a series of grip tasks. Over a span of about five seconds, the researchers would present one of two objects, then show a red or yellow light to signal which of two grips to use, and then flash a green light to signal that the grip should begin. Analyzing the recordings, the researchers could observe how the patterns of neural activity changed at each stage of each task.
Vargas-Irwin used an analysis technique he developed, called SSIMS, that can accurately detect patterns of activity in collections of neurons without relying on assumptions about what the brain is trying to do. The method distinguishes patterns by differences in their activity and groups them by similarity, without the researchers imposing their own interpretation of what is happening in the task.
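As a rough illustration of this kind of assumption-free grouping, the sketch below clusters synthetic trial-by-trial population activity purely by pairwise similarity. It is a minimal stand-in for the general idea, not the published SSIMS pipeline; the data, the distance measure, the embedding method, and all parameters here are assumptions made for illustration.

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform
from sklearn.manifold import MDS
from sklearn.cluster import AgglomerativeClustering

rng = np.random.default_rng(0)

# Synthetic data: 80 trials of binned firing rates from 50 neurons, drawn
# from 4 hypothetical object-grip conditions (20 trials each). These numbers
# are assumptions for illustration, not the study's recordings.
n_trials, n_neurons, n_conditions = 80, 50, 4
condition_means = rng.gamma(shape=2.0, scale=5.0, size=(n_conditions, n_neurons))
labels_true = np.repeat(np.arange(n_conditions), n_trials // n_conditions)
rates = rng.poisson(condition_means[labels_true]).astype(float)

# Step 1: pairwise dissimilarity between trial-level population patterns.
dissim = squareform(pdist(rates, metric="euclidean"))

# Step 2: embed trials in a low-dimensional space that preserves the
# relative distances between their activity patterns.
embedding = MDS(n_components=2, dissimilarity="precomputed",
                random_state=0).fit_transform(dissim)

# Step 3: group trials by similarity alone -- no task labels are used,
# so any structure that emerges comes from the activity itself.
clusters = AgglomerativeClustering(n_clusters=n_conditions).fit_predict(embedding)
print("recovered cluster sizes:", np.bincount(clusters))
```

The key design point mirrored here is that the condition labels never enter the analysis: trials fall into groups only because their activity patterns resemble one another.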
What the analysis showed is that neurons in the ventral premotor cortex follow patterns that differentiate objects and actions. They began to show distinct, identifiable patterns of activity as soon as the object was presented, even before the animal knew how it was supposed to grasp that object. By the time grips were actually made, the patterns had become so distinct that all four object-grip combinations could be identified with about 95 percent accuracy.
"We just look at the neural activity patterns in and of themselves and the relationships between them; we can quantify their relative similarity and group them without any knowledge of the what the kinematics are," Vargas-Irwin said. "In this particular experiment we see that we can subdivide the neural activity patterns into groups that correspond to the basic grips and objects."
This was somewhat surprising, Donoghue added, because this "motor" part of the brain is thought to be near the end stages of making a movement, well after sensory processing like object discrimination has been completed. The new findings suggest that this isn't so.
Meaning and more work
The results of the study demonstrate that objects have a significant effect on how the grip plan evolves. That the brain can produce a variety of activity patterns and still arrive at an appropriate grip plan suggests it is flexible enough to handle a wide variety of object contexts, and can do so with a local network of neurons.
It's also apparent in the study that the plan to grip an object evolves well in advance of actual execution. Early interpretation of grip planning, including accounting for the distinctive form that plans take in the context of different objects, could allow a brain-computer interface decoder to send a motion command to a prosthesis more quickly and accurately, informed by what is to be gripped, Vargas-Irwin said.
Vargas-Irwin and his colleagues are continuing with experiments to determine how well the findings can be generalized -- to a wider variety of objects, for instance -- and how much the structure of the experiments and training affects the neural patterns.
Vargas-Irwin said he is optimistic that the findings could ultimately have direct application to improving brain-computer interface design and performance for patients with severe paralysis.
Story Source:
Materials provided by Brown University. Note: Content may be edited for style and length.
Journal Reference:
- C. E. Vargas-Irwin, L. Franquemont, M. J. Black, J. P. Donoghue. Linking Objects to Actions: Encoding of Target Object and Grasping Strategy in Primate Ventral Premotor Cortex. Journal of Neuroscience, 2015; 35 (30): 10888 DOI: 10.1523/JNEUROSCI.1574-15.2015