
Discovering the basics of 'active touch'

Study in mice identifies neurons that sense touch and motion, a combo needed to actively perceive the external world

Date:
April 20, 2017
Source:
Johns Hopkins Medicine
Summary:
Working with genetically engineered mice -- and especially their whiskers -- researchers report they have identified a group of nerve cells in the skin responsible for what they call 'active touch,' a combination of motion and sensory feeling needed to navigate the external world. The discovery of this basic sensory mechanism advances the search for better 'smart' prosthetics for people, ones that provide more natural sensory feedback to the brain during use.
FULL STORY

Working with genetically engineered mice -- and especially their whiskers -- Johns Hopkins researchers report they have identified a group of nerve cells in the skin responsible for what they call "active touch," a combination of motion and sensory feeling needed to navigate the external world. The discovery of this basic sensory mechanism, described online April 20 in the journal Neuron, advances the search for better "smart" prosthetics for people, ones that provide more natural sensory feedback to the brain during use.

Study leader Daniel O'Connor, Ph.D., assistant professor of neuroscience at the Johns Hopkins University School of Medicine, explains that over the past several decades, researchers have amassed a wealth of knowledge about the sense of touch. "You can open up textbooks and read all about the different types of sensors or receptor cells in the skin," he says. "However, almost everything we know is from experiments where tactile stimulation was applied to the stationary skin -- in other words, passive touch."

Such "passive touch," O'Connor adds, isn't how humans and other animals normally explore their world. For example, he says, people entering a dark room might search for a light switch by actively feeling the wall with their hands. To tell if an object is hard or soft, they'd probably need to press it with their fingers. To see if an object is smooth or rough, they'd scan their fingers back and forth across an object's surface.

Each of these forms of touch combined with motion, he says, is an active way of exploring the world rather than waiting for a touch stimulus to be presented. Each also requires the ability to sense a body part's relative position in space, an ability known as proprioception.

While some research has suggested that the same populations of nerve cells, or neurons, might be responsible for sensing both the touch and the proprioception needed for this sensory-motor integration, whether this is true, and which neurons accomplish the feat, has remained largely unknown, O'Connor says.

To find out more, O'Connor and his team developed an experimental system with mice that allowed them to record electrical signals from specific neurons in the skin during both touch and motion.

The researchers accomplished this, they report, by working with members of a laboratory led by David Ginty, Ph.D., a former Johns Hopkins University faculty member now at Harvard Medical School, to develop genetically altered mice. In these animals, sensory neurons in the skin known as Merkel afferents were modified so that they responded not only to touch -- their "native" stimulus, and one long documented in previous research -- but also to blue light, which skin nerve cells don't normally respond to.

The scientists trained the rodents to run on a mouse-sized treadmill with a small motorized pole attached to the front that could move to different locations. Before the mice started running, the researchers used their light- and touch-sensitized system to find a single Merkel afferent near each animal's whiskers and recorded the electrical signals from this neuron with an electrode.

Much like humans use their hands to explore the world through touch, mice use their whiskers, explains O'Connor. Consequently, as the animals began running on the treadmill, they moved their whiskers back and forth in a motion that researchers call "exploratory whisking."

Using a high-speed camera focused on the animals' whiskers, the researchers took nearly 55 million frames of video while the mice ran and whisked. They then used machine-learning algorithms to sort the movements into three categories: when the rodents weren't whisking or in contact with the pole; when they were whisking without contacting the pole; and when they were whisking against the pole.

They then connected each of these movements -- using video snapshots captured 500 times every second -- to the electrical signals coming from the animals' blue-light-sensitive Merkel afferents.
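To make that alignment step concrete, here is a minimal sketch in Python. It is purely illustrative: the article does not describe the team's actual analysis code, so the variable names, category codes and firing-rate calculation below are assumptions for demonstration only. The only details taken from the text are the 500-frames-per-second snapshot rate and the three behavioral categories.

```python
import numpy as np

# Illustrative sketch only -- not the study's actual pipeline. It assumes each
# video frame has already been labeled with one of the three behavioral
# categories described above, and that spike times from one Merkel afferent
# are available in seconds.

FRAME_RATE_HZ = 500  # video snapshots captured 500 times per second (from the article)
QUIET, WHISK_FREE, WHISK_CONTACT = 0, 1, 2  # hypothetical codes for the three categories

def rate_per_category(frame_labels, spike_times_s):
    """Mean firing rate of the afferent within each behavioral category."""
    frame_labels = np.asarray(frame_labels)
    # Map each spike to the video frame during which it occurred.
    spike_frames = np.floor(np.asarray(spike_times_s) * FRAME_RATE_HZ).astype(int)
    spike_frames = spike_frames[spike_frames < len(frame_labels)]

    rates = {}
    for code, name in [(QUIET, "quiet"),
                       (WHISK_FREE, "whisking, no contact"),
                       (WHISK_CONTACT, "whisking against pole")]:
        n_spikes = np.sum(frame_labels[spike_frames] == code)
        seconds_in_category = np.sum(frame_labels == code) / FRAME_RATE_HZ
        rates[name] = n_spikes / seconds_in_category if seconds_in_category > 0 else float("nan")
    return rates

# Made-up example: 10 seconds of labeled video and 200 random spike times.
rng = np.random.default_rng(0)
labels = rng.integers(0, 3, size=10 * FRAME_RATE_HZ)
spikes = np.sort(rng.uniform(0, 10, size=200))
print(rate_per_category(labels, spikes))
```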

The results show that the Merkel afferents produced action potentials -- the electrical spikes that neurons use to communicate with each other and the brain -- when their associated whiskers contacted the pole. That finding wasn't particularly surprising, O'Connor says, because of these neurons' well-established role in touch.

However, he says, the Merkel afferents also responded robustly when the whiskers were moving in the air without touching the pole. By delving into the specific electrical signals, the researchers discovered that the action potentials were precisely related to a whisker's position in space. These findings suggest that Merkel afferents play a dual role in touch and proprioception, and in the sensory-motor integration necessary for active touch, O'Connor says.
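One simple way to picture what it means for spike rate to track a whisker's position is a tuning curve: bin the whisker's angle during free whisking and compute the mean firing in each bin. The sketch below is a hypothetical illustration of that idea under assumed data and bin choices, not the authors' analysis.

```python
import numpy as np

# Hypothetical illustration: relate per-frame spike counts to whisker angle.
# Angle range, bin count, and the simulated "neuron" are all assumptions.

def position_tuning_curve(whisker_angle_deg, spike_counts, n_bins=12):
    """Mean spikes per video frame as a function of whisker angle."""
    angles = np.asarray(whisker_angle_deg, dtype=float)
    counts = np.asarray(spike_counts, dtype=float)

    edges = np.linspace(angles.min(), angles.max(), n_bins + 1)
    bin_idx = np.clip(np.digitize(angles, edges) - 1, 0, n_bins - 1)

    curve = np.full(n_bins, np.nan)
    for b in range(n_bins):
        in_bin = bin_idx == b
        if in_bin.any():
            curve[b] = counts[in_bin].mean()
    centers = 0.5 * (edges[:-1] + edges[1:])
    return centers, curve

# Made-up example: a simulated afferent that fires more as the whisker protracts.
rng = np.random.default_rng(1)
angles = rng.uniform(-30, 30, size=5000)                       # whisker angle, degrees
counts = rng.poisson(0.5 * np.clip((angles + 30) / 60, 0, 1))  # spikes per frame
centers, curve = position_tuning_curve(angles, counts)
print(np.round(curve, 3))  # mean spikes per frame should rise with angle
```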

Although these findings are particular to mouse whiskers, he cautions, he and his colleagues believe that Merkel afferents in humans could serve a similar function, because many anatomical and physiological properties of Merkel afferents appear similar across a range of species, including mice and humans.

Besides shedding light on a basic biological question, O'Connor says, his team's research could also eventually improve artificial limbs and digits. Some prosthetics are now able to interface with the human brain, allowing users to move them using directed brain signals. While such motion is a huge advance over traditional static prosthetics, it still doesn't allow the smooth movement of natural limbs. By integrating signals similar to those produced by Merkel afferents, he explains, researchers might eventually be able to create prosthetics that send signals about touch and proprioception to the brain, allowing movements more akin to those of natural limbs.


Story Source:

Materials provided by Johns Hopkins Medicine. Note: Content may be edited for style and length.


Journal Reference:

  1. Kyle S. Severson et al. Active Touch and Self-Motion Encoding by Merkel Cell-Associated Afferents. Neuron, April 2017; DOI: 10.1016/j.neuron.2017.03.045

