New technology uses mouth gestures to interact in virtual reality
- Date: October 5, 2017
- Source: Binghamton University
- Summary: Researchers have developed a new technology that allows users to interact in a virtual reality environment using only mouth gestures.
Researchers at Binghamton University, State University of New York have developed a new technology that allows users to interact in a virtual reality environment using only mouth gestures.
The proliferation of affordable virtual reality head-mounted displays provides users with realistic, immersive visual experiences. However, a head-mounted display occludes the upper half of the user's face, preventing facial action recognition based on the whole face. To address this issue, Binghamton University Professor of Computer Science Lijun Yin and his team created a new framework that interprets mouth gestures as a medium for interaction within virtual reality in real time.
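The paper itself details the recognition framework; the sketch below is not the authors' method, only a rough illustration of the general idea of classifying gestures from the unoccluded lower face. It assumes 2D mouth-landmark coordinates are already available from some lower-face tracker, and the feature definitions and thresholds are purely illustrative.

```python
import numpy as np

# Gesture labels loosely matching the demo described in the article; illustrative only.
GESTURES = ("neutral", "open_mouth", "smile")

def mouth_features(landmarks: np.ndarray) -> tuple[float, float]:
    """Reduce an (N, 2) array of 2D mouth-landmark coordinates to two coarse
    features: aspect ratio (height / width) and width. Any off-the-shelf
    lower-face landmark detector could supply the points; none is assumed here."""
    width = float(landmarks[:, 0].max() - landmarks[:, 0].min())
    height = float(landmarks[:, 1].max() - landmarks[:, 1].min())
    return height / (width + 1e-6), width

def classify_gesture(landmarks: np.ndarray, neutral_width: float) -> str:
    """Toy rule-based classifier: an open mouth is tall relative to its width,
    and a smile is noticeably wider than the user's neutral baseline."""
    aspect, width = mouth_features(landmarks)
    if aspect > 0.6:                      # illustrative threshold
        return "open_mouth"
    if width > 1.15 * neutral_width:      # illustrative threshold
        return "smile"
    return "neutral"
```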
Yin's team tested the application on a group of graduate students. Once a user put on a head-mounted display, they were presented with a simple game whose objective was to guide the player's avatar through a forest and eat as many cakes as possible. Players selected their direction of movement by rotating their heads, moved using mouth gestures, and could eat a cake only by smiling. The system was able to detect and classify users' mouth movements, achieving high recognition rates, and was demonstrated and validated through this real-time virtual reality application.
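As a sketch of how such per-frame gesture classifications could drive the demo described above, the mapping below translates head yaw and a recognized gesture into avatar commands. The assignment of "open mouth" to forward movement is an assumption for illustration; the article only specifies that mouth gestures move the avatar and that smiling eats a cake.

```python
import math
from dataclasses import dataclass

@dataclass
class Avatar:
    """Minimal avatar state for a cake-collecting demo like the one described."""
    x: float = 0.0
    y: float = 0.0
    heading: float = 0.0      # radians, driven by head rotation
    cakes_eaten: int = 0

def update_avatar(avatar: Avatar, head_yaw: float, gesture: str,
                  near_cake: bool, step: float = 0.1) -> None:
    """Map one frame of tracking output to avatar actions: head rotation
    steers, an open-mouth gesture (assumed here) moves the avatar forward,
    and a smile eats a cake when one is within reach."""
    avatar.heading = head_yaw
    if gesture == "open_mouth":
        avatar.x += step * math.cos(avatar.heading)
        avatar.y += step * math.sin(avatar.heading)
    elif gesture == "smile" and near_cake:
        avatar.cakes_eaten += 1
```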
"We hope to make this applicable to more than one person, maybe two. Think Skype interviews and communication," said Yin. "Imagine if it felt like you were in the same geometric space, face to face, and the computer program can efficiently depict your facial expressions and replicate them so it looks real."
Though the technology is still in the prototype phase, Yin believes it could be applied in a wide range of fields.
"The virtual world isn't only for entertainment. For instance, health care uses VR to help disabled patients," said Yin. "Medical professionals or even military personal can go through training exercises that may not be possible to experience in real life. This technology allows the experience to be more realistic."
Students Umur Aybars Ciftci and Xing Zhang contributed to this research.
The paper, "Partially occluded facial action recognition and interaction in virtual reality applications," was presented at the 2017 IEEE International Conference on Multimedia and Expo.
Story Source:
Materials provided by Binghamton University. Note: Content may be edited for style and length.
Journal Reference:
- Umur Aybars Ciftci, Xing Zhang, Lijun Yin. Partially occluded facial action recognition and interaction in virtual reality applications. 2017 IEEE International Conference on Multimedia and Expo (ICME), 2017; DOI: 10.1109/ICME.2017.8019545