How the brain constructs the world
Insights into processes underlying the integration and storage of sensory signals
- Date:
- February 9, 2018
- Source:
- Scuola Internazionale Superiore di Studi Avanzati
- Summary:
- How are raw sensory signals transformed into a brain representation of the world that surrounds us? Investigators have now uncovered the contributions to perception of a brain region called posterior parietal cortex. They show that posterior parietal cortex contributes to the merging of signals from different sensory modalities, as well as to the formation of memories about the history of recent stimuli.
How are raw sensory signals transformed into a brain representation of the world that surrounds us? The question was first posed over 100 years ago, but new experimental strategies make the challenge more exciting than ever. SISSA investigators have now uncovered the contributions to perception of a brain region called posterior parietal cortex. In two separate papers published in Neuron and Nature, they show that posterior parietal cortex contributes to the merging of signals from different sensory modalities, as well as to the formation of memories about the history of recent stimuli.
For decades, researchers have been investigating how the nervous system makes sense of the signals brought into the brain by the sensory organs. Some basic facts are well known: sensory receptor cells convert external events -- for instance, light waves, skin vibration, or air pressure waves -- into electrical messages that enter the brain. But neuronal activity does not lead to conscious experience until it is further elaborated in the cerebral cortex. Ongoing streams of sensory signals are transformed from representations of basic elements, in primary sensory cortical areas, into more complex combinations of features in higher-order areas; sensory events become meaningful once compared to recent and distant memories as well as to expectations. Though the scheme outlined above is backed up by countless studies, the physiological mechanisms remain unresolved.
In a study published in the journal Neuron, researchers at the International School for Advanced Studies (SISSA) investigated how the signals arriving through multiple senses are integrated. In daily life, once we are familiar with the combined sensory properties of an object, we can recognize that object (a banana, let us say) independently of the modality by which we receive the sensory signal -- by sight, by texture and shape, by taste. Nader Nikbakht and co-authors looked for the mechanisms by which knowledge about an object is triggered independently of the sensory channel engaged. The investigators trained rats to explore a grating made of raised black and white bars. The grating's orientation was reset randomly on each test trial, and the rats learned to approach the object and respond differently for two categories of orientation: 0±45 degrees ('horizontal') and 90±45 degrees ('vertical'). On each trial, the rat encountered the object through vision alone, touch alone, or vision and touch together.
By comparing accuracy under the three conditions, the investigators found that performance in the combined visual-tactile condition was better than that predicted by the sum of the individual signals: the two sensory channels work together efficiently to generate a better representation of the object. As the rats explored the object, the investigators also measured neuronal activity in the posterior parietal cortex (PPC), a region positioned between the primary sensory cortical areas for touch, vision and audition. They then implemented a mathematical model to interpret the information carried by large sets of neurons. The model accurately predicted how the rat would classify the stimulus on each single trial. A final analysis was particularly revealing: while neurons varied widely in how they encoded object orientation or category, a given neuron's response was identical under the three modality conditions. "This means that the message of the neurons was the object itself, not the sensory modality through which the object was explored," notes Mathew Diamond, senior investigator on the research.
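As a rough illustration of that kind of comparison (this is not the authors' actual analysis, and the accuracies below are hypothetical placeholder values), one can ask how well a bimodal observer should perform if the visual and tactile channels were combined optimally but independently, and then check whether measured bimodal performance exceeds that prediction:

```python
import numpy as np
from scipy.stats import norm

def dprime(p_correct):
    """Convert two-category accuracy into a d' sensitivity index
    (assumes an unbiased observer; the exact convention is illustrative)."""
    return 2 * norm.ppf(p_correct)

# Hypothetical unisensory accuracies, for illustration only (not published values).
p_vision, p_touch = 0.80, 0.75

d_v, d_t = dprime(p_vision), dprime(p_touch)

# Prediction if the two channels were combined optimally but independently:
# sensitivities add in quadrature (standard ideal-observer cue combination).
d_pred = np.sqrt(d_v**2 + d_t**2)
p_pred = norm.cdf(d_pred / 2)

print(f"visual d' = {d_v:.2f}, tactile d' = {d_t:.2f}")
print(f"predicted bimodal accuracy = {p_pred:.3f}")
# If the measured visual-tactile accuracy exceeds p_pred, integration is
# better than expected from independent channels ('supralinear').
```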
"Because sensory signals originate with real objects that have multiple physical attributes, it is reasonable to expect that sensory systems have evolved to function in some intermeshed manner," observes Nikbakht. "In the mammalian nervous system, dedicated circuits integrate multiple modalities thus, boosting the quality of the percept. Perceiving an object results from fusing the individual senses, not merely summating them." Diamond adds that "posterior parietal cortex carries out one step in the transformation done by the cortex as a whole, allowing real things in the world around us to be recognized independently of the sensory system we employ. 'Supramodal' perception reflects the construction of a much more abstract representation than that based on simple object features."
In another study, published in Nature, Athena Akrami and co-authors investigated how recent sensory memories are formed and maintained. Suppose you have misplaced your cell phone. Its ringtone, repeated once every two seconds, alerts you to an incoming call: it must be somewhere in the house, but you are not sure where. As you search from room to room, you store in memory the amplitude of the last ring and compare it to the next, in this way determining whether you have moved closer or farther away. But how stable is the memory of the last ring? Neuroscientists have long known that as sensations fade away, their memory shifts towards the 'prior,' that is, the statistical mean of many recent stimuli; the neuronal mechanisms generating the prior, however, were unknown. Akrami and co-investigators trained rats to compare, on each trial, the amplitudes of two auditory stimuli separated by a delay of several seconds, a task that resembles tracking down the cell phone by comparing the current ring to the preceding one. Hundreds of such trials (each containing two stimuli) followed one another in a sequence. The behavioral data revealed that as the rat awaited the second stimulus of a trial, its memory of the first stimulus shifted towards the mean of the preceding stimuli.
The experiment thus confirmed the sliding of memory towards the expected value, a phenomenon that earlier studies have termed 'contraction bias.' But how does the brain apply prior statistics to the stimulus stored in memory? Because the PPC has been argued to be a critical locus for memory, Akrami's study focused on this area -- the same brain region examined by Nikbakht et al. Indeed, neuronal activity in PPC revealed a trace of the history of stimuli preceding the current trial. Akrami and colleagues then silenced activity in PPC during the task using optogenetics, a method that controls neuronal activity with light. Remarkably, suppressing PPC significantly improved the rats' performance. Why? The key is that the memory of the first stimulus was not held in isolation but was influenced by preceding trials. That is, the contraction bias reduced the fidelity of the stored sensory information, pulling the memory towards the prior. In many situations, contraction bias offers an enormous advantage: whenever the available information is not exact, the prior is on average the best guess. However, when each sensory event is independent of the others, as in Akrami's study, contraction bias reduces the accuracy of memories. Removing the influence of PPC by optogenetic silencing allowed the rats to remember each stimulus without the interference carried over from preceding trials. Neuroscientists now have, for the first time, a good indication of the cortical circuit that holds a record of previous sensory signals.
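A minimal sketch of this logic (not the authors' model; the contraction strength, noise level, and stimulus range below are arbitrary illustrative values) shows why pulling a remembered stimulus toward the prior hurts performance when successive stimuli are independent:

```python
import numpy as np

rng = np.random.default_rng(0)

n_trials = 10_000
lam = 0.3         # hypothetical contraction strength toward the prior
noise_sd = 2.0    # memory noise accumulated over the delay (arbitrary units)

# Two independent stimulus amplitudes per trial, as in a delayed-comparison task.
s1 = rng.uniform(60, 92, n_trials)
s2 = rng.uniform(60, 92, n_trials)
prior = s1.mean()  # long-run mean of recent first stimuli

def accuracy(contraction):
    # Memory of the first stimulus is pulled toward the prior and corrupted by noise.
    memory = (1 - contraction) * s1 + contraction * prior \
             + rng.normal(0, noise_sd, n_trials)
    choice_s1_bigger = memory > s2
    return np.mean(choice_s1_bigger == (s1 > s2))

print(f"with contraction bias : {accuracy(lam):.3f}")
print(f"bias removed          : {accuracy(0.0):.3f}")
# With independent stimuli, contracting the memory toward the prior lowers
# accuracy, so removing the bias improves performance -- consistent in spirit
# with the improvement seen when PPC was silenced.
```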
"In two separate studies, we were able to attribute two very different functions to the same cortical region, the posterior parietal cortex. And, in total, three sensory modalities were found to be integrated within PPC. This attests to the adaptability of cortical processing," says Diamond. "Primary sensory cortical regions receive signals from the sensory organs and carry a reliable message about the external world. But in all stages of cortical processing beyond this input level, the operations shift from moment to moment according to the behavioral task the whole organism needs to accomplish. Understanding how cortex identifies the current needs and adapts accordingly is an enormous challenge for the future."
Both studies were collaborative efforts, the first publication originating from collaboration within SISSA between the research groups led by Mathew Diamond and Davide Zoccolan, and the second between SISSA and Princeton University. Nader Nikbakht currently holds a postdoctoral position at the Massachusetts Institute of Technology, where he moved upon completion of his PhD and postdoctoral studies at SISSA. Athena Akrami currently holds a postdoctoral position at Princeton, having completed her PhD and postdoctoral studies at SISSA. A substantial part of the work described in the Nature paper was done at Princeton University in the laboratory of Carlos Brody.
Story Source:
Materials provided by Scuola Internazionale Superiore di Studi Avanzati. Note: Content may be edited for style and length.
Journal References:
- Nader Nikbakht, Azadeh Tafreshiha, Davide Zoccolan, Mathew E. Diamond. Supralinear and Supramodal Integration of Visual and Tactile Signals in Rats: Psychophysics and Neuronal Mechanisms. Neuron, 2018; 97(3): 626. DOI: 10.1016/j.neuron.2018.01.003
- Athena Akrami, Charles D. Kopec, Mathew E. Diamond, Carlos D. Brody. Posterior parietal cortex represents sensory history and mediates its effects on behaviour. Nature, 2018. DOI: 10.1038/nature25510