
How we hear distance: Echoes are essential for humans to perceive how far away a sound is

Date:
April 1, 2015
Source:
University of Connecticut
Summary:
Mammals are good at figuring out which direction a sound is coming from, whether it's a predator breathing down our necks or a baby crying for its mother. But how we judge how far away that sound is was a mystery until now. Researchers report that echoes and fluctuations in volume are the cues we use to figure out the distance between us and the source of a noise.

Mammals are good at figuring out which direction a sound is coming from, whether it's a rabbit with a predator breathing down its neck or a baby crying for its mother. But how we judge how far away that sound is was a mystery until now. Researchers from UConn Health report in the 1 April issue of the Journal of Neuroscience that echoes and fluctuations in volume (amplitude modulation) are the cues we use to figure out the distance between us and the source of a noise.

"This opens up a new horizon," says Duck O. Kim, a neuroscientist at UConn Health. Researchers have long understood how we can tell a sound's direction--whether it's to our left or right, front or back, and above or below us. But how we tell how far away it is had remained a mystery. "The third dimension of sound location was pretty much unknown," says Kim.

All natural sounds, including speech, have amplitude modulation. Kim and his colleague Shigeyuki Kuwada suspected that amplitude modulation, and the way echoes muddy it, were together key to our ability to perceive a sound's distance from us. To explore the idea, they used tiny microphones to record the sounds inside rabbits' ears while playing sounds from different locations. They then used these recordings to simulate modulated or unmodulated noise coming from different distances. Kim and Kuwada played the simulated sounds back to the rabbit and measured the responses of neurons in the rabbit's inferior colliculus (IC), a region of the midbrain known to be important for sound perception.

When the rabbit heard the simulated sounds, certain types of IC neurons fired more when the sound was closer and the depth of modulation was higher--that is, when there was a bigger difference between the sound's maximum and minimum amplitude.
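To make "depth of modulation" concrete, here is a minimal, illustrative Python sketch; it is not taken from the study, and the sample rate, modulation frequency, and depth value are assumed. It builds amplitude-modulated noise and reads the depth off its envelope as the contrast between peaks and valleys.

```python
import numpy as np

# Illustrative only: amplitude-modulated (AM) noise with envelope
# (1 + m*cos(2*pi*fm*t)), where m is the modulation depth.
fs = 44100                        # sample rate in Hz (assumed)
t = np.arange(0, 1.0, 1 / fs)     # one second of signal
fm = 32                           # modulation frequency in Hz (assumed)
m = 0.8                           # modulation depth: 0 = flat, 1 = fully modulated

carrier = np.random.randn(t.size)             # broadband noise carrier
envelope = 1 + m * np.cos(2 * np.pi * fm * t)
am_noise = envelope * carrier

# Depth is the contrast between the envelope's peaks and valleys:
# (max - min) / (max + min), which recovers m here.
depth = (envelope.max() - envelope.min()) / (envelope.max() + envelope.min())
print(f"modulation depth = {depth:.2f}")      # prints 0.80
```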

Reverberations, or echoes, tend to degrade amplitude modulation, smoothing out the amplitude's peaks and valleys. Almost any environment has echoes, as sounds bounce off objects such as walls, trees, and the ground. The farther away the source of a sound is from a listener, the more echoes there are, and the more degraded the depth of amplitude modulation gets. As you would expect, the neurons fired less and less as the sound moved farther away and the depth of amplitude modulation degraded more and more.
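The smoothing effect of reverberation can be sketched in the same spirit: convolving the amplitude-modulated noise with a toy room response, a direct sound followed by an exponentially decaying tail (all values assumed here, not the study's actual stimuli), flattens the envelope and lowers the measured modulation depth.

```python
import numpy as np

def modulation_depth(x, fs, smooth_ms=5.0):
    """Rough envelope-contrast estimate of modulation depth."""
    win = max(1, int(fs * smooth_ms / 1000))
    env = np.convolve(np.abs(x), np.ones(win) / win, mode="same")  # crude envelope
    hi, lo = np.percentile(env, [95, 5])       # robust peak/valley estimates
    return (hi - lo) / (hi + lo)

fs = 44100
t = np.arange(0, 1.0, 1 / fs)
am_noise = (1 + 0.8 * np.cos(2 * np.pi * 32 * t)) * np.random.randn(t.size)

# Toy "room": a direct sound followed by a 300 ms exponential reverb tail
# (all values assumed for illustration).
tail = 0.3 * np.exp(-np.arange(int(0.3 * fs)) / (0.05 * fs))
impulse_response = np.concatenate(([1.0], tail))
reverberant = np.convolve(am_noise, impulse_response)[: am_noise.size]

print("dry depth:        ", round(modulation_depth(am_noise, fs), 2))
print("reverberant depth:", round(modulation_depth(reverberant, fs), 2))  # smaller
```

In this toy setup the dry signal keeps most of its 0.8 depth, while the reverberant version comes out substantially flatter, mirroring the degradation the IC neurons appear to track.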

Pavel Zahorik, a researcher at the University of Louisville School of Medicine, tested the same amplitude-modulated noise with human volunteers and got the same results: people need both amplitude modulation and reverberation to figure out how far away a sound is. Without amplitude modulation, a person can't tell how far away a noise is. Neither can she do it in an anechoic (echo-free) room.

"Reverberation is usually considered a bad thing," detrimental to hearing clearly, says Kuwada. "But it is necessary and beneficial in order to recognize distance."

Judging sound distance is a crucial survival skill, whether you're a bunny or a human--is that monster breathing down my neck, or huffing and puffing 20 yards behind me? Do I have time to cross the street before that car I hear in the distance pulls around the bend? Kim and Kuwada suggest that a better understanding of the acoustics and neuroscience of distance perception could contribute to better hearing aids and prostheses, and perhaps reveal more subtle aspects of our sound perception. The importance of amplitude modulation is still poorly understood. Laurel Carney, a colleague at the University of Rochester, modeled the ear and IC neural circuitry and replicated the neural firing patterns recorded by Kim and Kuwada. The researchers hope that tweaking the model will give them more insight into the neurons' responses.

Kim and Kuwada's next step will be a two-eared (binaural) study that ties together the perception of distance with the horizontal and vertical directions of sound.


Story Source:

Materials provided by University of Connecticut. Note: Content may be edited for style and length.


Journal Reference:

  1. D. O. Kim, P. Zahorik, L. H. Carney, B. B. Bishop, S. Kuwada. Auditory Distance Coding in Rabbit Midbrain Neurons and Human Perception: Monaural Amplitude Modulation Depth as a Cue. Journal of Neuroscience, 2015; 35 (13): 5360 DOI: 10.1523/JNEUROSCI.3798-14.2015

