
Tactile input affects what we hear

Date:
January 5, 2010
Source:
University of British Columbia
Summary:
Humans use their whole bodies, not just their ears, to understand speech, according to new linguistics research. It is well known that humans naturally process facial expression along with what is being heard to fully understand what is being communicated. This study is the first to show we also naturally process tactile information to perceive sounds of speech.

Humans use their whole bodies, not just their ears, to understand speech, according to University of British Columbia linguistics research.

It is well known that humans naturally process facial expression along with what is being heard to fully understand what is being communicated. The UBC study is the first to show we also naturally process tactile information to perceive sounds of speech.

Prof. Bryan Gick of UBC's Dept. of Linguistics, along with PhD student Donald Derrick, found that air puffs directed at the skin can bias the perception of spoken syllables. "This study suggests we are much better at using tactile information than was previously thought," says Gick, also a member of Haskins Laboratories, an affiliate of Yale University.

The study, published in Nature on November 26, 2009, offers findings that may be applied to telecommunications, speech science and hearing aid technology.

English speakers use aspiration -- the tiny bursts of breath accompanying speech sounds -- to distinguish sounds such as "pa" and "ta" from unaspirated sounds such as "ba" and "da." Study participants heard eight repetitions of these four syllables while inaudible air puffs -- simulating aspiration -- were directed at the back of the hand or the neck.

When the subjects -- 66 men and women -- were asked to distinguish the syllables, those heard simultaneously with air puffs were more likely to be perceived as aspirated: subjects misheard "ba" as the aspirated "pa" and "da" as the aspirated "ta." The brain associated the air puffs felt on the skin with aspirated syllables, interfering with the perception of what was actually heard.

The researchers say it is unlikely that listeners actually feel a speaker's aspiration on their skin in everyday conversation. The phenomenon is more likely analogous to lip-reading, where the brain's auditory cortex activates when the eyes see lips move, signaling speech. From the brain's point of view, you are "hearing" with your eyes.

"Our study shows we can do the same with our skin, "hearing" a puff of air, regardless of whether it got to our brains through our ears or our skin," says Gick.

Future research may examine how auditory, visual and tactile information interact, forming the basis of a new multi-sensory paradigm of speech perception. Additional studies may investigate how many kinds of speech sounds are affected by airflow, offering insight into how people interact with their physical environment.


Story Source:

Materials provided by University of British Columbia. Note: Content may be edited for style and length.


