Study of how brain corrects perceptual errors has implications for brain injuries, robotics
- Date: March 24, 2011
- Source: University of California - Los Angeles
- Summary: New research provides the first evidence that sensory recalibration -- the brain's automatic correcting of errors made by our sensory or perceptual systems -- can occur instantly.
New research provides the first evidence that sensory recalibration -- the brain's automatic correcting of errors in our sensory or perceptual systems -- can occur instantly.
"Until recently, neuroscientists thought of sensory recalibration as a mechanism that is primarily used for coping with long-term changes, such as growth during development, brain injury or stroke," said Ladan Shams, a UCLA assistant professor of psychology and an expert on perception and cognitive neuroscience. "It appeared that extensive time, and thus many repetitions of error, were needed for mechanisms of recalibration to kick in. However, our findings indicate we don't need weeks, days, or even minutes or seconds to adapt. To some degree, we adapt instantaneously.
"If recalibration can occur in milliseconds, as we now think, then we can adapt even to transient changes in the environment and in our bodies."
In Shams' study, reported in the March 23 issue of the Journal of Neuroscience, 146 individuals, primarily UCLA undergraduates, performed a fundamental perceptual task: localizing sights and sounds. They looked at the center of a large screen that had eight speakers hidden behind it. Sometimes they heard only a brief burst of sound somewhat like radio static; sometimes they saw only a quick flash of light; and sometimes they both heard a sound and saw a light. They were asked to determine where the sound was and where the light was.
The participants, the researchers found, were much more accurate in determining where the light was than where the sound was.
The 'ventriloquist illusion'
"The perceived location of sound gets shifted toward the location of the visual stimulus," Shams said. "That is known as the 'ventriloquist illusion.' If I repeatedly, for thousands of times, present a flash of light on the left side and a sound on the right side, afterwards, even when the sound is presented alone, the perceived location of sound will be shifted to the left, toward where the flash was. The visual stimulus affects the perception of the sound, not only while it is present, but also as an after-effect. This phenomenon has been known, but neuroscientists thought it required a large number of repeated exposures.
"We found this shift can happen not after thousands of trials, but after just a single trial. A small fraction of a second is enough to cause this perceptual shift. These findings provide the first evidence that sensory recalibration can occur rapidly, after only milliseconds. This indicates that recalibration of auditory space does not require the accumulation of substantial evidence of error to become engaged, and instead it is operational continuously."
In the study, the subjects were presented with a variety of different combinations. For example, in one trial the flash could be 10 degrees to the right of the sound; in the next, it could be 15 degrees to the left of the sound; then there could be sound and no flash; then flash and no sound; then the sound and flash could be in the same location.
"For every trial that contained sound alone (with no flash), we studied how the subjects located the sound in relation to what they experienced in the previous trial, where there was a flash. We found a very strong correlation; if the flash was to the right of the sound in the previous trial, then on the trial with the sound alone, the sound was perceived a little to the right; if the flash was to the left of the sound on the previous trial, then on the trial with sound alone, the sound was perceived a little to the left. The larger the discrepancy, the larger the shift."
While the subjects seemed to be making perceptual errors rather than correcting them, Shams stressed that this was an unnatural environment in which researchers artificially created a discrepancy between auditory and visual stimuli to show how quickly recalibration could occur.
In the real world, she said, recalibration would actually reduce errors for a person experiencing an auditory-visual discrepancy caused by a flaw in one of their senses.
Implications for rehabilitation, robotics
This research could have implications for rehabilitation from brain injuries and could help in the development of prosthetics: people who receive hearing devices, for example, can use vision to guide their learning of how to localize sound. It also has implications for the design of recalibration in artificial systems, which could be useful for aircraft as well as robots.
Our senses are not so different from a robot's sensors. NASA's Mars Rovers, for instance, sample the planet's surface using cameras, sensors, microphones and other equipment, which, like our senses, can get damaged. If a camera becomes misaligned while traversing the rocky terrain, the rover's function will be diminished.
"Sensory recalibration is a critical function for both biological and artificial systems," Shams said. "As with artificial sensors, biological sensory systems can become faulty and need correction every now and then."
Ailments such as a blocked ear canal or a problem with our sense of smell or vision can lead to distorted perceptions -- or shifts -- in our spatial map. If there is a systematic error in our auditory system, it needs to be corrected. When biological sensory systems become faulty, the brain typically provides the correction automatically.
"Fortunately, human sensory systems already possess the uncanny ability to recalibrate their own localization maps through the interactions between visual and auditory systems," Shams said. "Our new findings show that the multisensory recalibration is continuously functioning after only milliseconds of sensory discrepancies, allowing for rapid adaptation to changes in sensory signals. This rapid adaptation allows not only adaptation to long-term changes such as those induced by injury and disease, but also adaptation to transient changes, such as changes in the echo properties of our surrounding space as we walk from one room to another room or from indoors to outdoors, or when one ear is temporarily blocked by hair or headwear."
The research by Shams and David Wozny -- who earned his Ph.D. from UCLA in August in Shams' laboratory, and is currently a postdoctoral fellow at Oregon Health and Science University -- is shedding light for the first time on the dynamics of sensory recalibration. They have learned, for example, that repeated exposures will increase the shift, which accumulates quickly before slowing down.
"Vision is teaching hearing," Shams said. "If vision tells me one time that sound is not here (indicating her left), but here (her right), then I shift my auditory map a little; if it happens twice in a row, I shift even more. If it happens three times in a row, I shift even more."
An optimal learning strategy?
Using the same set of data, Shams and Wozny published a computational model in the Aug. 5, 2010, issue of the journal PLoS Computational Biology. The model allows them to analyze why subjects perceive the sounds and sights in a particular way and what computations occur in their brains when they hear sounds and see flashes. (Ulrik Beierholm, a former UCLA graduate student of Shams who is currently a postdoctoral fellow at University College London's Gatsby Computational Neuroscience Unit, was a co-author.)
"By analyzing the data using three models, we can determine which model best explains the data and can characterize the strategy the subjects' brains use to make perceptual decisions, Shams said.
Determining the locations of sights and sounds is a basic brain function, and scientists assume that such functions are performed optimally because they have been refined over millions of years of evolution, Shams said. Because this is a basic task, neuroscientists would expect almost all brains to perform it in the same way.
"Surprisingly, we found the perceptual task is not performed uniformly across subjects. Different people use different strategies to perform this task," Shams said. "Secondly, the vast majority of people, at least 75 percent, use a strategy that is considered seriously sub-optimal."
What is this sub-optimal strategy? By way of analogy, Shams said, if there is a 70 percent chance of rain, you would be wise to take an umbrella with you.
"What we found is that instead of people taking the umbrella every time there is, say, a 70 percent chance of rain, so to speak, they match the probability: They take the umbrella only 70 percent of the time," she said.
When subjects were presented with a noise and a flash and were asked where they perceived the noise and flash to be coming from, their brains had to figure out whether the sound and flash were coming from the same location or from different locations.
"If they infer there is a 70 percent chance that the sound and flash are coming from the same object, for the majority of observers, 70 percent of the time they go with that estimate and 30 percent of the time they go with the unlikely estimate," Shams said. "Under conventional measures of optimality, which implicitly assume static environments, this strategy is highly suboptimal.
"However, the conventional way of thinking about these problems may not be correct after all. In a dynamic world, things may change constantly. The optimal strategy is to learn, and to learn you need to take some risks. Even if that's not the best choice at that time, in the long run, it may well be the best choice, because by exploring different possibilities, you may learn more. So paradoxically, a strategy that appears sub-optimal may actually be near-optimal. Perhaps the way we think about brain function should be revised."
The research was funded by a National Institutes of Health neuroimaging training grant, by UCLA's Graduate Division and by UCLA's Academic Senate.
Story Source:
Materials provided by University of California - Los Angeles. Note: Content may be edited for style and length.
Journal Reference:
- D. R. Wozny, L. Shams. Recalibration of Auditory Space following Milliseconds of Cross-Modal Discrepancy. Journal of Neuroscience, 2011; 31 (12): 4607 DOI: 10.1523/JNEUROSCI.6079-10.2011