Echo templates aid mental mapping in bats
- Date: August 2, 2016
- Source: eLife
- Summary: A new study provides insights on how bats recognise their surroundings to help them build mental maps.
A study published in eLife provides new insights on how bats recognise their surroundings to help them build mental maps.
Bats have excellent spatial memory and navigate with ease to important locations such as roosts and foraging grounds. But exactly how these animals recognise such places through echolocation -- perception based on sound waves and their echoes -- is largely unknown. New research from the Universities of Bristol and Antwerp suggests the animals observe and remember echo templates to help form a cognitive map of their environment.
"When we visually recognise places, such as our living room or office, we identify and localise the various objects that make up the scene," says Marc Holderied, PhD, Reader in Biology at the University of Bristol, and senior author of the study.
"Echolocation does not allow bats to do this, as the information it provides is more limited. We therefore wanted to discover how these animals recognise their locations differently to those with vision."
The team proposed that template-based place recognition might underlie sonar-based navigation in bats. This would mean that the animals recognise places by remembering their echo signature, rather than their three-dimensional (3D) layout.
"The viability of a template-based approach to place recognition relies on two properties. One of these is that templates must allow for unique classification in order for places to be recognisable. In other words, they must encode the bat's specific locations in space to allow it to recognise previously visited places," says first author Dieter Vanderelst, PhD, from the University of Antwerp, who led the study as a research fellow at the University of Bristol.
To test their hypothesis, the team built an 'artificial bat': a device containing ultrasonic microphones and an ultrasonic speaker that act as ears and a mouth. Using this device, they collected a large number of echoes from three different locations: the green and leafy St. Andrew's Park and Royal Fort Gardens in Bristol, and the more open, stonier landscape of a park in Midreshet Ben Gurion, Israel.
Data were collected at the typical bat-flight heights of about two to three metres, and measurements from each site were gathered and stored by a computer integrated into the device. The team then derived echo templates from these data and found that the echoes returning from each place were distinctive enough to be used to recognise the location.
"Importantly, our method used the echoes without inferring the location or identity of objects, such as plants and trees, at each site. In other words, the data support our hypothesis that bats can recognise places by remembering how they sound, rather than how they appear through the animals' 3D sonar imaging," Vanderelst explains.
The research also suggests that the use of prominent landmarks might be an emergent feature of template-based place recognition.
"The prominence of a template's catchment area reflects how likely it is that the template will be observed and stored in a map during exploration. For example, we found the catchment distances to be greater in the Israeli corridor of large boulders than in the corridor of vegetation in the Royal Fort Gardens, suggesting that bats could use the boulders as landmarks for mapping," Vanderelst adds.
"This leads us to believe that cognitive mapping based on templates would show a natural preference to use such landmarks, as they return stronger and more recognisable echo signatures. With these new insights in mind, our aim is to try and piece together the entire puzzle of the navigation tendencies and capabilities in bats."
Story Source:
Materials provided by eLife. Note: Content may be edited for style and length.
Journal Reference:
- Dieter Vanderelst, Jan Steckel, Andre Boen, Herbert Peremans, Marc W. Holderied. Place recognition using batlike sonar. eLife, 2016; 5. DOI: 10.7554/eLife.14188