Neuroscientists decode brain maps to discover how we take aim
- Date:
- September 10, 2014
- Source:
- York University
- Summary:
- A new brain map shows how the brain encodes allocentric and egocentric space in different ways during activities that involve manual aiming. The finding could help healthcare providers develop therapies for patients with damage to these brain areas, according to the neuroscientists.
Serena Williams won her third consecutive US Open title a few days ago, thanks in part to obvious factors such as physical strength and endurance. But how much did her brain, with its egocentric and allocentric functions, help the American tennis star retain the cup?
Quite significantly, according to York University neuroscience researchers whose recent study shows that different regions of the brain help to visually locate objects relative to one's own body (self-centred or egocentric) and those relative to external visual landmarks (world-centred or allocentric).
"The current study shows how the brain encodes allocentric and egocentric space in different ways during activities that involve manual aiming," explains Distinguished Research Professor Doug Crawford, in the Department of Psychology. "Take tennis for example. Allocentric brain areas could help aim the ball toward the opponent's weak side of play, whereas the egocentric areas would make sure your muscles return the serve in the right direction."
The finding will help healthcare providers develop therapies for patients with damage to these two brain areas, according to the neuroscientists at York University's Centre for Vision Research. "As a neurologist, I am excited by the finding because it provides clues for doctors and therapists about how they might design different therapeutic approaches," says Ying Chen, lead researcher and PhD candidate in the School of Kinesiology and Health Science.
The study, "Allocentric versus Egocentric Representation of Remembered Reach Targets in Human Cortex," published in the Journal of Neuroscience, was conducted using the state-of-the-art fMRI scanner at York U's Sherman Health Science Research Centre. A dozen participants were tested using the scanner, which Chen modified to distinguish brain areas relating to these two functions.
The participants completed three different tasks involving remembered visual targets: egocentric reach (remembering the absolute target location), allocentric reach (remembering the target location relative to a visual landmark) and a nonspatial control, colour report (reporting the colour of the target).
When participants remembered egocentric targets' locations, areas in the upper occipital lobe (at the back of the brain) encoded visual direction. In contrast, lower areas of the occipital and temporal lobes encoded object direction relative to other visual landmarks. In both cases, the parietal and frontal cortex (near the top of the brain) coded reach direction during the movement.
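To make the distinction concrete, here is a minimal sketch (not taken from the study, using made-up 2D positions) of how the same remembered target can be coded egocentrically, relative to one's own body, or allocentrically, relative to a visual landmark, and how the two memories point to different places if the landmark later moves:

```python
import numpy as np

# Hypothetical 2D world coordinates (illustrative only, not data from the study).
observer = np.array([0.0, 0.0])   # the body/eye origin
target = np.array([3.0, 4.0])     # the remembered reach target
landmark = np.array([2.0, 5.0])   # an external visual landmark

# Egocentric code: target location stored relative to one's own body.
egocentric = target - observer

# Allocentric code: target location stored relative to the landmark.
allocentric = target - landmark

# Suppose the landmark has shifted by the time the reach is made.
landmark_at_reach = landmark + np.array([1.0, 0.0])

# The egocentric memory still points to the original absolute location...
reach_from_egocentric = observer + egocentric             # [3., 4.]
# ...while the allocentric memory follows the landmark.
reach_from_allocentric = landmark_at_reach + allocentric  # [4., 4.]

print("egocentric reach endpoint: ", reach_from_egocentric)
print("allocentric reach endpoint:", reach_from_allocentric)
```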
Story Source:
Materials provided by York University. Note: Content may be edited for style and length.
Journal Reference:
- Ying Chen et al. Allocentric versus Egocentric Representation of Remembered Reach Targets in Human Cortex. Journal of Neuroscience, September 2014. DOI: 10.1523/JNEUROSCI.1445-14