How the brain merges the senses
- Date: June 6, 2016
- Source: Universitaet Bielefeld
- Summary: Utilizing information from all the senses is critical for building a robust and rich representation of our surroundings. Given the wealth of multisensory information constantly bombarding us, however, how does our brain know which signals go together and thus need to be combined? And how does it integrate such related signals? Scientists have proposed a computational model that explains multisensory integration in humans using a surprisingly simple processing unit.
A sudden explosion, cracking sounds and flashing lights. In the blink of an eye, you realize that the sounds and lights belong together; you look down and see firecrackers on the sidewalk. The human brain is surprisingly efficient at processing multisensory information. However, we still do not know how it solves the seemingly simple task of deciding whether sound and light belong together or not. "Figuring out a correspondence between the senses is by no means a trivial problem," says Dr. Cesare Parise, who works at CITEC in the research group Cognitive Neurosciences. Parise, who is also active at the Max Planck Institute for Biological Cybernetics, is the lead author of the new study, which he wrote together with Professor Dr. Marc Ernst, who conducted research at Bielefeld University through March 2016. "Despite originating from the same physical events, visual and auditory information are processed in largely independent neural pathways, and yet, with no apparent effort, we can instantly tell which signals belong together. Such a task would be challenging even for the most advanced robots."
To understand how humans combine visual and auditory information, volunteers participated in a perception experiment in which they observed random sequences of clicks and flashes. After each sequence, they had to report whether sound and light perceptually belonged together, and which signal appeared first. Statistical analyses revealed that human responses were systematically determined by the similarity (i.e., correlation) of the temporal sequences of the clicks and flashes. "This is a very important finding," says Prof. Marc Ernst, "not just because it shows that the brain uses the temporal correlation of sound and light to detect whether or not they are physically related, but also because it opens an even more intriguing question: how does the brain detect correlation across the senses?"
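To make the "similarity of the temporal sequences" concrete, it can be read as an ordinary correlation between the two stimulus streams. The Python sketch below computes the Pearson correlation of two hypothetical binary click/flash sequences; the sequence length, event pattern, and corruption rate are illustrative assumptions, not the stimulus parameters used in the study.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

n_steps = 20                                 # hypothetical number of time bins per trial
clicks = rng.integers(0, 2, size=n_steps)    # 1 = click in this bin, 0 = silence
flashes = clicks.copy()
flip = rng.random(n_steps) < 0.2             # corrupt ~20% of bins so the streams
flashes[flip] = 1 - flashes[flip]            # are similar but not identical

# Pearson correlation of the two temporal sequences: high values suggest
# that the clicks and flashes share a common physical cause.
r = np.corrcoef(clicks, flashes)[0, 1]
print(f"temporal correlation r = {r:.2f}")
```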
To answer this question, Parise and Ernst used computational modeling and computer simulations, and identified an elementary neural mechanism that could closely replicate human perception. This mechanism, called the Multisensory Correlation Detector, monitors the senses and looks for similarity (correlation) across visual and auditory signals: if the stimuli have a similar temporal structure, the brain concludes that they belong together and integrates them. Remarkably, this mechanism closely resembles the motion detectors found in the insect brain.
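The article does not spell out the detector's internals, but given its stated kinship with insect motion detectors, a Hassenstein-Reichardt-style sketch conveys the idea: two mirror-symmetric subunits each multiply a low-pass-filtered copy of one sensory signal with the other, and combining the subunit outputs yields a correlation readout. The filter time constants, combination rule, and function names below are illustrative assumptions, not the paper's fitted model.

```python
import numpy as np

def low_pass(signal, tau, dt=0.001):
    """First-order exponential low-pass filter (leaky integrator)."""
    out = np.zeros_like(signal, dtype=float)
    alpha = dt / tau
    for t in range(1, len(signal)):
        out[t] = out[t - 1] + alpha * (signal[t] - out[t - 1])
    return out

def mcd_correlation(visual, auditory, tau_fast=0.05, tau_slow=0.15):
    """Reichardt-style correlation readout for two sensory streams.

    Time constants are illustrative assumptions, not fitted values.
    """
    v_fast, v_slow = low_pass(visual, tau_fast), low_pass(visual, tau_slow)
    a_fast, a_slow = low_pass(auditory, tau_fast), low_pass(auditory, tau_slow)
    u1 = v_slow * a_fast       # subunit 1: slow vision x fast audition
    u2 = v_fast * a_slow       # subunit 2: mirror-symmetric pairing
    return np.mean(u1 * u2)    # high when the streams co-vary in time

# Streams with shared temporal structure should score higher than
# independent ones:
rng = np.random.default_rng(seed=1)
flash = (rng.random(1000) < 0.01).astype(float)   # sparse flash train
click_matched = np.roll(flash, 20)                # same pattern, slight lag
click_random = (rng.random(1000) < 0.01).astype(float)
print(mcd_correlation(flash, click_matched) >
      mcd_correlation(flash, click_random))       # True for these settings
```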
"This is exciting because it shows that the brain systematically exploits general-purpose processing strategies, which can be implemented across very different domains of perception where the correlation between signals is a key feature, such as the perception of visual motion, 3-D perception using binocular disparities, binaural hearing, and now multisensory processing. Furthermore, such correlation mechanisms can be found in very different animal species, from insects to vertebrates, including humans," says Prof. Marc Ernst, who has just accepted a new position at Ulm University. To further test the generalizability of this model, Parise and Ernst ran additional computer simulations, where they used the Multisensory Correlation Detector model to replicate several previous findings on the temporal and the spatial aspects of multisensory perception. Without further changes, the same model proved capable of replicating human perception in all simulated studies, and displayed the same temporal and spatial constraints of multisensory perception found in humans.
"Over the last decade we have discovered that the brain integrates multisensory information in a statistically optimal fashion. However, the nature of the underlying neural mechanisms has so far defied proper scientific explanation," says Dr. Cesare Parise. "This study marks a milestone in our understanding of human perception, as it provides for the first time a general mechanism capable of explaining a large variety of findings in multisensory perception."
"This result has strong application potential" says Dr. Parise, who has just accepted a new position as research scientist at Oculus VR (Facebook): "A deep understanding of multisensory processing opens new clinical perspectives for neurological syndromes that are associated with multisensory impairments, such as Autism Spectrum Disorder and Dyslexia. Moreover, our computational model could be easily implemented for use in robots and artificial perception."
The Department of Cognitive Neuroscience of the Biological Faculty is affiliated with the Cluster of Excellence Cognitive Interaction Technology (CITEC) at Bielefeld University. The department focuses on human multisensory perception, sensorimotor integration, perceptual learning, and human-machine interaction, combining human psychophysical experimentation with computational modeling. It currently consists of 15 members from a variety of disciplinary backgrounds, including biology, cognitive science, psychology, medicine, physics, and engineering.
Story Source:
Materials provided by Universitaet Bielefeld. Note: Content may be edited for style and length.
Journal Reference:
- Cesare V. Parise, Marc O. Ernst. Correlation detection as a general mechanism for multisensory integration. Nature Communications, 2016; 7: 11543. DOI: 10.1038/ncomms11543