
Understanding body language of mice

Scientists create new technology to read complex patterns of behavior

Date: December 16, 2015
Source: Harvard Medical School
Summary: Mouse behavior naturally divides into movement motifs lasting less than a second. These 'behavioral syllables' can be reused by the brain to achieve specific goals.

It might not rival Newton's apple, which famously inspired the law of gravity, but the collapse of a lighting scaffold played a key role in the discovery that mice, like humans, have body language.

Harvard Medical School scientists have developed new computational techniques that can make sense of the bodily movements of mice, organizing them into syllables and grammar. Along the way, they also proposed a solution to a longstanding problem in neuroscience: how to objectively study complex three-dimensional patterns of animal behavior without relying on subjective human observers.

Their results appear in Neuron.

"If you look into the brain and ask how any individual brain cell fires as an animal generates any given behavior, what you find is the brain is a very noisy place, and so understanding how brain activity leads to action is hard," said Sandeep Datta, HMS assistant professor of neurobiology and senior author of the paper. "We think that by developing this method we can get new insight into how the brain creates behavior, and how that process goes wrong in models of disease. That's going to be a great way to build better and more targeted therapeutics."

Advances in genomic sequencing are revealing more high-risk genes for neurodevelopmental and neuropsychiatric disorders, Datta said. That makes it more urgent to develop better ways of studying how these genetic alterations change patterns of behavior in mouse models of human disease.

Their new method, based on machine learning, reveals for the first time how behavior is organized at fine time scales, which is key to understanding how individual genes or neural circuits influence bodily action. Datta's team hopes this will lead to a better understanding of how the brain builds patterns of action.

This particular research project began when Alexander Wiltschko, the study's lead author and a graduate student in the Datta lab, was studying mice and odors. Sensory cues such as smell provide a window into the brain, revealing how information is converted into different patterns of action.

Wiltschko was closely observing videos of mice reacting to the scent of a fox, deciphering the constituent parts of those movements, such as sniffing, freezing and rolling into a ball. Animals use repeated motifs of action, the scientists hypothesized, and they wanted to figure out how many there were, how long they lasted and what they looked like.

Because mice are nocturnal animals, researchers often study their behavior under infrared lights, which sometimes fall from their scaffolds.

Wiltschko had a better idea: Microsoft Kinect, a video-game peripheral device whose infrared camera captures movements and creates a three-dimensional on-screen avatar for playing virtual tennis or other games. Why not use that tool to record and then analyze mouse posture?

"When you have a mouse behaving in a particular way, you want to turn that behavior into some numbers that you can analyze," Wiltschko said.

The Microsoft Kinect solved more than the falling-light problem. Using computer vision, the team pulled the mouse's image out of the depth video stream to create a 3-D model of its body as it moved. Matthew J. Johnson, a study co-author and an HMS research fellow in neurobiology, built a computational model that revealed how the various poses and transitions within mouse movements were interrelated.
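
For readers curious about that first step, here is a minimal, hypothetical sketch of how an animal might be isolated from Kinect-style depth frames by background subtraction. It is not the authors' pipeline; the function name extract_mouse, the min_height_mm threshold and the use of NumPy and SciPy are all illustrative assumptions.

    # Hypothetical sketch (not the authors' pipeline): isolate a mouse from a
    # Kinect-style depth frame by subtracting an empty-arena background.
    # Assumes depth frames arrive as 2-D NumPy arrays in millimeters.
    import numpy as np
    from scipy import ndimage

    def extract_mouse(depth_frame, background, min_height_mm=15):
        """Return a cropped height map of the animal above the arena floor."""
        # Height of each pixel above the floor (background minus current depth).
        height = background.astype(float) - depth_frame.astype(float)
        mask = height > min_height_mm  # keep pixels that rise above the floor

        # Keep only the largest connected blob, assumed to be the mouse.
        labels, n_blobs = ndimage.label(mask)
        if n_blobs == 0:
            return None
        sizes = ndimage.sum(mask, labels, range(1, n_blobs + 1))
        mouse = labels == (int(np.argmax(sizes)) + 1)

        # Crop a tight box around the animal so every frame is comparable.
        rows, cols = np.where(mouse)
        return (height * mouse)[rows.min():rows.max() + 1,
                                cols.min():cols.max() + 1]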

That's when the scientists noticed that changes in the mouse's posture could be organized into units.

Once the scientists analyzed the mouse-pose data, they realized that the movements were built from short, distinct blocks of pose sequences that appeared in many contexts, not just fear of the fox.

"That immediately made us think of the songbird literature, and by analogy, language: words that don't overlap with each other but happen in sequence over time," Wiltschko said.

In the Neuron paper they describe mouse movements as language, with its own syllables and grammar. When the mouse smells the odor of a fox, it avoids the fox not by creating new behaviors but by turning up the volume on behavioral components it was already using. When the mouse balls itself up to hide from the fox, for example, what changes is the degree to which it uses a pose that is present in other behaviors.
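
A hypothetical illustration of that "turning up the volume" idea, with made-up numbers: the same set of syllables appears in both conditions, but their usage frequencies shift.

    # Hypothetical numbers illustrating "turning up the volume": the same
    # syllables occur with and without the fox odor, but their frequencies shift.
    import numpy as np

    syllables_control = np.array([0, 1, 2, 1, 0, 3, 1, 2])  # per-frame labels, no odor
    syllables_odor = np.array([3, 3, 1, 3, 3, 2, 3, 3])     # same syllable set, fox odor

    def usage(labels, n_syllables=4):
        return np.bincount(labels, minlength=n_syllables) / len(labels)

    print(usage(syllables_control))  # syllable 3 ("ball up") is rare at baseline
    print(usage(syllables_odor))     # syllable 3 dominates under the odor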

Their quantitative approach relies on statistics and probabilistic machine learning, specifically generative modeling, a mathematical tool that allowed the researchers to dissect the strategy the mouse uses to avoid a predator.
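
As a rough sketch of what such a generative model can look like, the toy code below treats each behavioral syllable as a discrete hidden state with its own brief movement dynamics and uses a transition matrix as the "grammar." Every name and number here is illustrative, not the published model.

    # Toy sketch of the generative idea (illustrative, not the published model):
    # each behavioral "syllable" is a discrete hidden state with its own brief
    # movement dynamics, and a transition matrix plays the role of "grammar".
    import numpy as np

    rng = np.random.default_rng(0)
    n_syllables, pose_dim, n_frames = 4, 10, 300

    # "Grammar": probability of moving from one syllable to the next.
    grammar = rng.dirichlet(np.full(n_syllables, 0.5), size=n_syllables)

    # Each syllable gets its own stable linear dynamics over the pose vector.
    dynamics = [0.9 * np.eye(pose_dim)
                + 0.02 * rng.standard_normal((pose_dim, pose_dim))
                for _ in range(n_syllables)]

    syllables = np.zeros(n_frames, dtype=int)
    poses = np.zeros((n_frames, pose_dim))
    for t in range(1, n_frames):
        syllables[t] = rng.choice(n_syllables, p=grammar[syllables[t - 1]])
        poses[t] = (dynamics[syllables[t]] @ poses[t - 1]
                    + 0.1 * rng.standard_normal(pose_dim))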

They tested the model with mice engineered to carry two copies of a genetic mutation that made them waddle, a striking difference that has been described many times before. But human observers couldn't see subtler abnormal behaviors that the waddling mice shared with mice engineered to have only one copy of the mutated gene. Experienced scientists thought the mice with just one faulty copy were normal, but the model showed that some of the walking syllables in the "normal" mice actually were not.

"Once we have this mathematical way of summarizing and thinking about the behavior the mouse is exhibiting, it gives us a handle on understanding what changes when we alter the environmental conditions," Johnson said. "What we've done is just the beginning, and we hope both scientists and people in statistics and machine learning can improve and build on these ideas."

Their method could be applied to large-scale screens to test psychoactive drugs in animals, the scientists said, to pinpoint effects on behavior that have previously been difficult to quantify and explain.

"The old way of characterizing and classifying behavior depends on what humans think behavior is," Datta said.

There is a long and rich history of people observing animals to understand what they do, the scientists noted. Ethologists in the 1800s would watch animals in mating season or when searching for food, seeing snippets of behavior repeated over an animal's lifetime before concluding those pieces of action were important.

"Our new way of characterizing and classifying behavior depends on the underlying structure hidden in the behavioral data itself," Datta said. "We've shown it's possible to objectively count the behaviors, to understand how they flow over each other over time, and to use that kind of framework to really get a deep understanding without any human bias for the underlying structure of actions."


Story Source:

Materials provided by Harvard Medical School. Original written by Elizabeth Cooney. Note: Content may be edited for style and length.


Journal Reference:

  1. Alexander B. Wiltschko, Matthew J. Johnson, Giuliano Iurilli, Ralph E. Peterson, Jesse M. Katon, Stan L. Pashkovski, Victoria E. Abraira, Ryan P. Adams, Sandeep Robert Datta. Mapping Sub-Second Structure in Mouse Behavior. Neuron, 16 December 2015. DOI: 10.1016/j.neuron.2015.11.031

