
NeuroMechFly v2: Simulating how fruit flies see, smell, and navigate

Date:
November 14, 2024
Source:
Ecole Polytechnique Fédérale de Lausanne
Summary:
Scientists have advanced their NeuroMechFly model, simulating fruit fly movement in the real world. With integrated vision and smell, NeuroMechFly v2 helps us understand brain-body coordination, setting a path for neuroengineering's role in robotics and AI.

All animals, large or small, must move with remarkable precision to interact with the world. Understanding how the brain controls movement is a fundamental question in neuroscience. For larger animals, this is challenging because of the complexity of their brains and nervous systems. But the fruit fly, Drosophila melanogaster, has a smaller and therefore more easily mappable brain, allowing scientists to gain detailed insights into how its nervous system drives behavior.

To understand how the nervous system controls actions, researchers in Pavan Ramdya's group at EPFL created a simulated reality in which a virtual fly can operate and respond the way real flies do. This program, known as NeuroMechFly v2, implements a neuromechanical model that goes beyond basic motor functions. By incorporating visual and olfactory sensing, complex terrains, and fine motor feedback, NeuroMechFly v2 simulates how a fruit fly would navigate through its environment while reacting to sights, smells, and obstacles.
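The core idea of such a simulated reality is a closed sensory-motor loop: the model's "brain" turns sensory observations into motor commands, and the simulated body and world turn those commands into the next set of observations. The following toy sketch illustrates that loop in miniature; the names, the one-dimensional world, and the proportional controller are illustrative assumptions, not the actual NeuroMechFly v2 interface.

```python
# Toy closed-loop simulation: a "fly" senses its distance to a target and
# steps toward it. Controller and dynamics are made-up illustrations of the
# sense -> decide -> act cycle, not the real NeuroMechFly v2 API.

def controller(obs):
    # "Brain": a simple proportional rule, step size scales with sensed distance.
    return 0.5 * obs

def environment_step(position, action):
    # "Body/world": integrate the motor command into a new body position.
    return position + action

def run_closed_loop(target=10.0, n_steps=20):
    position = 0.0
    for _ in range(n_steps):
        obs = target - position                        # sensory observation
        action = controller(obs)                       # observation -> motor command
        position = environment_step(position, action)  # motor command -> new state
    return position

final_position = run_closed_loop()  # approaches the target as the loop repeats
```

The point of the loop structure is that the controller never sees the world directly, only its own sensory stream — the same constraint a real fly (and the full neuromechanical model) operates under.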

Ramdya's research has focused on digitally replicating the principles underlying Drosophila motor control. In 2019, his group published DeepFly3D, software that uses deep learning to capture how a fly's legs move from images taken by multiple cameras. In 2021, Ramdya's team revealed LiftPose3D, a method for reconstructing 3D animal poses from images taken by a single camera. These efforts were complemented by his 2022 publication of NeuroMechFly, the first morphologically accurate digital "twin" of Drosophila.

With the second iteration of NeuroMechFly, the researchers have now added detailed features that mimic real fly anatomy and physiology. For example, they carefully updated the leg and joint angles to better match the biomechanics of real fruit fly movements. The model's "brain" can now process visual and olfactory information through virtual eyes and antennae, giving it a sensory experience close to that of an actual fruit fly.

This setup lets NeuroMechFly v2 simulate different control strategies for real-life tasks such as walking over rough terrain or turning in response to smells and visual cues. The team has demonstrated realistic fly behavior under different conditions. For instance, the model can track a moving object visually or navigate towards an odor source, while avoiding obstacles in its path.
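One classic control strategy for navigating toward an odor source is to compare the intensity sensed at the two antennae and steer toward the stronger side. The sketch below implements that idea in a toy two-dimensional odor field; the field shape, antenna geometry, and gains are all illustrative assumptions rather than the model's actual parameters.

```python
import math

# Toy bilateral odor taxis: the agent samples a made-up odor field at two
# antenna positions and turns toward the stronger smell while walking forward.

def odor_intensity(x, y, src=(3.0, 1.0)):
    # Illustrative field: intensity decays with squared distance from the source.
    d2 = (x - src[0]) ** 2 + (y - src[1]) ** 2
    return 1.0 / (1.0 + d2)

def walk_to_odor(steps=100, speed=0.1, antenna_offset=0.2, turn_gain=10.0):
    x, y, heading = 0.0, 0.0, 0.0
    closest = float("inf")
    for _ in range(steps):
        # Each antenna samples the odor slightly left/right of the heading.
        left = odor_intensity(x + antenna_offset * math.cos(heading + math.pi / 4),
                              y + antenna_offset * math.sin(heading + math.pi / 4))
        right = odor_intensity(x + antenna_offset * math.cos(heading - math.pi / 4),
                               y + antenna_offset * math.sin(heading - math.pi / 4))
        heading += turn_gain * (left - right)  # steer toward the stronger side
        x += speed * math.cos(heading)
        y += speed * math.sin(heading)
        closest = min(closest, math.hypot(x - 3.0, y - 1.0))
    return closest

closest_approach = walk_to_odor()  # the agent passes close to the odor source
```

Because steering depends only on the left-right intensity difference, the agent needs no map of the field — a purely local, reactive rule of this kind is one of the strategies a simulated fly can be tested against.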

NeuroMechFly v2 also allows researchers to infer neural activity in the brain based on the fly's experience in the virtual world. "By interfacing NeuroMechFly v2 with a recently published computational model of the fly's visual system, researchers can read out not only what the fly is seeing in the simulated environment, but also how real neurons might be responding," says Sibo Wang-Chen, who led the research.

With access to these neural activities, the scientists modelled how the fly might chase another fly -- for example, during courtship -- in a biologically plausible way. This was possible due to the hierarchical control system in the model, which lets higher-level "brain" functions interact with the lower-level motor functions -- an organization that mimics how real animals process sensory input and control their bodies.
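Hierarchical control of this sort can be sketched as two layers: a high-level "brain" that decides where to go (here, toward a moving target, as in chasing) and a low-level motor layer that executes the decision under physical constraints (here, a capped turning rate). The interface and numbers below are hypothetical illustrations, not the model's real architecture.

```python
import math

# Toy two-level controller: the decision layer picks a desired heading toward
# a moving target; the motor layer turns toward it with a limited turn rate.

def high_level(pos, target):
    # Decision layer: desired heading points at the target.
    return math.atan2(target[1] - pos[1], target[0] - pos[0])

def low_level(heading, desired, max_turn=0.3):
    # Motor layer: rotate toward the desired heading, at most max_turn rad/step.
    err = (desired - heading + math.pi) % (2 * math.pi) - math.pi
    return heading + max(-max_turn, min(max_turn, err))

def chase(n_steps=300, speed=0.12):
    x, y, heading = 0.0, 0.0, 0.0
    for t in range(n_steps):
        # The "other fly" circles slowly around the point (5, 5).
        target = (5 + math.cos(0.02 * t), 5 + math.sin(0.02 * t))
        heading = low_level(heading, high_level((x, y), target))
        x += speed * math.cos(heading)
        y += speed * math.sin(heading)
    return x, y, target

x, y, target = chase()  # the chaser ends up circling near the moving target
```

Separating the layers means the decision rule can be swapped out (chase, flee, follow an odor) without touching the motor layer — one reason this organization is attractive both in animals and in the simulation.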

Finally, researchers can also use NeuroMechFly v2 to study how the brain integrates sensory signals to maintain an awareness of the animal's state. To demonstrate this, Ramdya's team replicated the fly's ability to use feedback signals from leg movements to keep track of its location -- a behavior called path integration. This feature allows the simulated fly to "know" where it is, even when its visual inputs are limited. This kind of closed-loop sensory processing is a hallmark of biological intelligence and a critical milestone for neuroengineering.
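Path integration is, at its core, dead reckoning: summing self-motion signals (stride lengths and heading changes) to maintain a position estimate without any external landmark. The minimal sketch below walks a closed square from per-step motion signals alone; the step data are made-up illustrative values, not measurements from the model.

```python
import math

# Minimal path integration (dead reckoning): accumulate position from
# per-step self-motion signals only, with no external position reference.

def integrate_path(strides):
    """strides: list of (stride_length, heading_change) pairs, one per step."""
    x, y, heading = 0.0, 0.0, 0.0
    for length, d_heading in strides:
        heading += d_heading
        x += length * math.cos(heading)
        y += length * math.sin(heading)
    return x, y, heading

# Walk a square: turn 90 degrees, take ten unit steps, repeat four times.
square = []
for side in range(4):
    square.append((1.0, math.pi / 2))  # turn, then step
    square += [(1.0, 0.0)] * 9         # nine more straight steps

path_end = integrate_path(square)  # a closed square returns to the origin
```

Because the estimate depends only on internal motion signals, it keeps working when vision is limited — which is exactly the ability the team replicated in the simulated fly. The flip side, as in real dead reckoning, is that small per-step errors would accumulate without occasional sensory correction.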

Taken together, NeuroMechFly v2 enables researchers to investigate how the brain controls crucial behaviors using computational models. This paves the way for deeper insights into brain-body coordination, especially for species with complex sensory-motor systems. In the future, this model could serve as a blueprint for designing robots that navigate using sensory cues, such as tracking odors or adjusting their movements to stabilize their visual field, much as real animals do when exploring their environments.

By improving the machine learning models that control these simulations, researchers can also study how animal intelligence might pave the way for AI systems that are more autonomous, robust, and responsive to their surroundings.


Story Source:

Materials provided by Ecole Polytechnique Fédérale de Lausanne. Original written by Nik Papageorgiou. The original text of this story is licensed under Creative Commons CC BY-SA 4.0. Note: Content may be edited for style and length.


Journal Reference:

  1. Sibo Wang-Chen, Victor Alfred Stimpfling, Thomas Ka Chung Lam, Pembe Gizem Özdil, Louise Genoud, Femke Hurtak, Pavan Ramdya. NeuroMechFly v2: simulating embodied sensorimotor control in adult Drosophila. Nature Methods, 2024; DOI: 10.1038/s41592-024-02497-y

Cite This Page:

Ecole Polytechnique Fédérale de Lausanne. "NeuroMechFly v2: Simulating how fruit flies see, smell, and navigate." ScienceDaily. ScienceDaily, 14 November 2024. <www.sciencedaily.com/releases/2024/11/241112123423.htm>.
Ecole Polytechnique Fédérale de Lausanne. (2024, November 14). NeuroMechFly v2: Simulating how fruit flies see, smell, and navigate. ScienceDaily. Retrieved November 14, 2024 from www.sciencedaily.com/releases/2024/11/241112123423.htm
Ecole Polytechnique Fédérale de Lausanne. "NeuroMechFly v2: Simulating how fruit flies see, smell, and navigate." ScienceDaily. www.sciencedaily.com/releases/2024/11/241112123423.htm (accessed November 14, 2024).
