
Brain Signals From The Paralyzed Or Injured Captured By Computer

Date:
November 13, 2003
Source:
Society For Neuroscience
Summary:
Exciting new research into how signals from the brain can be captured by a computer or other device to carry out an individual's command may allow people with motor disabilities to more fully communicate and function in their daily lives.

Over the past several years, scientists have begun to address the needs of people with severe disabilities brought on by paralysis or injury by developing brain-computer interfaces (BCIs). These systems allow people to use signals directly from the brain for communication and control of movement. The research has progressed to a point where clinical applications can be anticipated, says Jonathan Wolpaw, MD, chair of the symposium, "Brain-Computer Interfaces for Communication and Control."

Research in technologies for obtaining brain signals for BCI applications has led to the development of implantable BCI devices that could be used by people with severe motor disabilities. In other work, investigators report advances in BCI-based movement control.

The BCIs already available and those under development differ greatly in the brain signals they use, in how they detect those signals, in the methods they use to translate the signals to carry out the person's commands, and in the kinds of devices the signals control.

Groundbreaking work conducted by Douglas J. Weber, PhD, at the University of Alberta, Edmonton, Canada, and his colleagues has led to the development of an implantable microelectrode array that can record neural sensory responses resulting from movements of the leg. The investigators have developed an analysis technique that allows accurate prediction of leg positions from the patterns of recorded neural activity.

The technique relies on the fact that multiple sensors acting together provide the central nervous system with important feedback for controlling movement. For example, sensors called muscle spindles that are embedded in muscle fibers measure the length and speed of muscle stretch, while other sensors in the skin respond to stretch and pressure. When an individual is paralyzed by injury or disease, neural signals from these sensors cannot reach the brain, and thus cannot be used to control motor responses. Paralysis also keeps neural signals originating in the motor regions of the brain from reaching the muscles.

The work of Weber and his colleagues shows that it is possible to extract feedback information from the body's natural sensors that could then be used to control a prosthetic device, allowing an individual to regain some command and control of his or her own movements.

A sterile surgical procedure is used to implant arrays of 36 microelectrodes into the dorsal root ganglion, the part of the spinal nerve that contains the cell bodies of these natural sensors. Recording from these sensors has historically been difficult because their cell bodies sit in this hard-to-reach nerve bundle where it enters the spinal cord.

The wires from the microelectrode array are led out through the skin to a small electrical connector. The procedure allows simultaneous recordings from many sensory nerves during normal motor activities such as walking. A digital camera tracks the position of the leg, and a mathematical analysis relates the sensory activity to leg movement. The investigators found that fewer than 10 neurons are needed to predict the path of the leg accurately. This finding is encouraging because it suggests that a small number of neurons could provide the feedback signals needed to control a prosthetic device.
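The article does not describe the analysis itself, but the sketch below illustrates the general idea of relating the firing rates of a handful of sensory neurons to limb position. It uses ordinary least-squares linear regression on synthetic data; the neuron count, noise levels, and position variables are invented for illustration and are not the investigators' actual method.

```python
# Hypothetical sketch of decoding limb position from a few sensory neurons.
# Synthetic data and a least-squares linear decoder stand in for the real analysis.
import numpy as np

rng = np.random.default_rng(0)

n_samples, n_neurons = 500, 8                       # fewer than 10 neurons, per the article
true_weights = rng.normal(size=(n_neurons, 2))      # assumed map: firing rates -> 2D leg position

firing_rates = rng.poisson(lam=20, size=(n_samples, n_neurons)).astype(float)
leg_position = firing_rates @ true_weights + rng.normal(scale=2.0, size=(n_samples, 2))

# Fit a linear decoder: leg_position ~ firing_rates @ W
W, *_ = np.linalg.lstsq(firing_rates, leg_position, rcond=None)

predicted = firing_rates @ W
r2 = 1 - ((leg_position - predicted) ** 2).sum() / ((leg_position - leg_position.mean(0)) ** 2).sum()
print(f"Decoding R^2 from {n_neurons} neurons: {r2:.2f}")
```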

Other investigators are developing wireless devices for recording neural activity. Groups from Brown University in Providence, R.I., and the Jet Propulsion Laboratory (JPL) in Pasadena, Calif., have both developed implantable wireless devices that use advanced microelectronics to avoid the wiring and head-fixation constraints of currently available neural recording systems.

According to JPL's Mohammad Mojarradi, PhD, the advantage of such wireless devices is that they allow recording of neural signals while an individual is moving and may pave the way for study of neural circuits responsible for even more complex mobility functions.

"Present implantable neural recording systems are passive devices, using a large bundle of wire and requiring the skull to stay still during the recording session," said Mojarradi. "Wireless devices allow recording of neural signals without restricting motion. Once this restriction is removed, we can look at complex motor functions and the neural circuits involved and potentially develop even larger highly advanced brain-machine systems."

The wireless device under development at JPL uses an array of low-noise analog amplifiers that boost the signals picked up by microelectrodes, an on-board microprocessor, and a two-way radio link that acts as a telemeter. The microprocessor interacts with the radio link and can be remotely programmed to detect and sort the neural signals received by the prosthetic device. The device is designed to be placed under the skin on the skull and connected to recording electrodes in the cortex; it amplifies the signals those electrodes record from the brain and transmits them through the wireless telemeter.
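The article does not say how the on-board microprocessor detects spikes, but a common first step, shown in the sketch below, is to flag samples that cross a threshold set relative to the channel's noise level. The sampling rate, noise amplitude, and threshold rule here are assumptions for illustration, not JPL's firmware.

```python
# Illustrative spike detection by threshold crossing on a synthetic recording.
import numpy as np

rng = np.random.default_rng(1)
fs = 30_000                                   # assumed sampling rate, Hz
signal = rng.normal(scale=5.0, size=fs)       # 1 s of baseline noise (microvolts)

spike_times = rng.choice(fs, size=30, replace=False)
signal[spike_times] -= 60.0                   # inject negative-going spikes

threshold = -4.0 * np.sqrt(np.mean(signal ** 2))   # -4 x RMS, a common rule of thumb
crossings = np.flatnonzero(signal < threshold)

print(f"Threshold {threshold:.1f} uV, detected {crossings.size} spike events")
```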

The researchers say the next step in their research is to create a single system-on-a-chip device by combining the analog array of amplifiers with the microprocessor and wireless link. Mojarradi and his team believe such a highly miniaturized system could lead to the development of a permanent implant to assist patients suffering from paralysis and other brain disorders.

In other work, Cyberkinetics, Inc., in Foxborough, Mass., is working with the Brown University team headed by Mijail D. Serruya to develop an implantable BCI called BrainGate for clinical use in human patients. BrainGate is designed to enable patients who have lost the use of their hands to gain accurate, rapid control of a computer desktop. Such control could allow a patient to communicate online, browse the Internet, and possibly adjust lights or other devices in his or her environment.

Cyberkinetics was founded by a team of researchers from Brown University led by John Donoghue, PhD. Cyberkinetics is seeking to commercialize a neural output device to help patients with severe motor impairment. All the authors are Cyberkinetics employees and shareholders.

"The disadvantage of the computerized assistive devices used today is that they require an individual to use substitute signals like voice or eye movement to manipulate a mouse or keyboard," said Donoghue. "The advantage of BrainGate is that it records directly from the brain and thus can translate brain activity into the intended hand movement over mouse or keyboard."

The BrainGate BCI consists of a microelectrode array sensor implanted into the motor cortex, an external cart containing computer hardware, and software that processes and decodes neural signals. Although no humans have been implanted with BrainGate yet, the device has been designed to meet human safety requirements.

Cyberkinetics hopes to start a pilot clinical trial with four to five quadriplegic individuals in 2004. Once the BrainGate device has been shown to record neural activity in paralyzed patients, the team at Cyberkinetics will explore how the signals can be translated into output signals that could be used to control a computer.
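The translation step has not yet been worked out for paralyzed patients, but one plausible scheme, sketched below under assumed details the article does not give (96 channels, 50-millisecond bins, a previously fitted linear decoder), is to bin spike counts, decode a two-dimensional cursor velocity from each bin, and integrate that velocity into a cursor position.

```python
# Rough sketch of turning binned spike counts into cursor movement.
# The decoder weights here are random placeholders for a fitted decoder.
import numpy as np

rng = np.random.default_rng(4)
n_channels, n_bins, bin_s = 96, 100, 0.05          # assumed array size and 50 ms bins

decoder = rng.normal(scale=0.05, size=(n_channels, 2))   # stands in for a fitted decoder
spike_counts = rng.poisson(lam=3, size=(n_bins, n_channels))

cursor = np.zeros(2)
trajectory = []
for counts in spike_counts:
    velocity = counts @ decoder            # decode intended 2D velocity for this bin
    cursor = cursor + velocity * bin_s     # integrate velocity into screen position
    trajectory.append(cursor.copy())

print("Final cursor position:", np.round(trajectory[-1], 2))
```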

In other work being presented at the symposium, Andrew Schwartz, PhD, will show how his group at the University of Pittsburgh is extending its work in cortical prostheses to robot control. The group previously showed that closed-loop control of a cortical prosthesis can produce excellent brain-controlled movements in virtual reality. After showing that a monkey can use direct brain control to move a robotic arm in 3D space while watching the movement in virtual reality, the researchers are now moving on to have the animal see and control the arm directly, without the virtual reality display. Although learning to use the robot as a tool appears to be more difficult for the animal, it has nevertheless learned to use the robot to reach targets held by the investigator.
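The article does not name the decoding method, but a classic approach in this line of research is the population vector: each neuron is assigned a preferred movement direction, and the decoded direction is the firing-rate-weighted sum of those preferred directions. The sketch below uses synthetic, cosine-tuned firing rates rather than recorded data.

```python
# Population-vector decoding of a 3D movement direction from synthetic neurons.
import numpy as np

rng = np.random.default_rng(3)
n_neurons = 40

# Random unit-length preferred directions in 3D.
preferred = rng.normal(size=(n_neurons, 3))
preferred /= np.linalg.norm(preferred, axis=1, keepdims=True)

intended = np.array([0.0, 1.0, 0.0])          # the movement the animal intends

# Cosine tuning: baseline + gain * cos(angle between intended and preferred direction).
rates = 10.0 + 8.0 * (preferred @ intended)

# Weight each preferred direction by its baseline-subtracted firing rate and sum.
pop_vector = ((rates - 10.0)[:, None] * preferred).sum(axis=0)
pop_vector /= np.linalg.norm(pop_vector)

print("Decoded direction:", np.round(pop_vector, 2))
```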

Jonathan Wolpaw's laboratory at the Wadsworth Center of the New York State Department of Health is finding that a non-invasive BCI, using EEG activity recorded from the scalp, can provide rapid multidimensional control of a movement signal with precision and speed comparable to that achieved in monkeys by invasive BCIs. This remarkable control by a non-invasive BCI depends on an adaptive training algorithm that identifies and focuses on the particular EEG features that the person is best able to control, and encourages the person to improve that control further. These initial results suggest that people with severe motor disabilities might use brain signals to operate a robotic arm or a neuroprosthesis without the risks involved in having electrodes implanted in their brains.
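As a minimal sketch, and not the Wadsworth algorithm itself, one simple way to identify and emphasize the EEG features a user controls best is to weight each feature, such as band power at a particular electrode, by how strongly it correlates with the intended target, and then drive the cursor with the weighted sum. The features and data below are invented for illustration.

```python
# Toy example of correlation-based weighting of EEG features for cursor control.
import numpy as np

rng = np.random.default_rng(2)
n_trials, n_features = 200, 6                      # e.g. band power at 6 electrodes
targets = rng.choice([-1.0, 1.0], size=n_trials)   # intended cursor direction per trial

features = rng.normal(size=(n_trials, n_features))
features[:, 2] += 0.8 * targets                    # the user modulates feature 2 well
features[:, 5] += 0.3 * targets                    # and feature 5 only weakly

# Adapt weights to each feature's correlation with the user's intent.
weights = np.array([np.corrcoef(features[:, j], targets)[0, 1] for j in range(n_features)])

control_signal = features @ weights                # stronger features get more influence
accuracy = np.mean(np.sign(control_signal) == targets)
print(f"Feature weights: {np.round(weights, 2)}, trial accuracy: {accuracy:.2f}")
```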

Andrea Kuebler, PhD, of the University of Tuebingen in Germany, is exploring the complex technical and human issues involved in providing severely disabled people with BCI-driven communication and control devices. Her group has compelling new data indicating that simple BCIs could greatly improve the quality of life of people with the most severe disabilities. These data imply that even people who are totally paralyzed can lead lives they enjoy if they can communicate, even to a limited extent, with caregivers, family members, and friends. By showing the potential clinical benefits of BCIs, these surprising results provide new incentive for their continued development and application.


Story Source:

Materials provided by Society For Neuroscience. Note: Content may be edited for style and length.


