
Haptics device creates realistic virtual textures

Date:
May 21, 2022
Source:
University of Southern California
Summary:
Tactile sensation is an incredibly important part of how humans perceive their reality. Haptics, or devices that can produce extremely specific vibrations that mimic the sensation of touch, are a way to bring that third sense to life. However, as far as haptics have come, humans are incredibly particular about whether or not something feels 'right,' and virtual textures don't always hit the mark. Now, researchers have developed a new method for computers to achieve that true texture -- with the help of human beings. Called a preference-driven model, the framework uses our ability to distinguish between the details of certain textures as a tool to give these virtual counterparts a tune-up.

Technology has allowed us to immerse ourselves in a world of sights and sounds from the comfort of our home, but there's something missing: touch.

Tactile sensation is an incredibly important part of how humans perceive their reality. Haptics, or devices that can produce extremely specific vibrations that mimic the sensation of touch, are a way to bring that third sense to life. However, as far as haptics have come, humans are incredibly particular about whether or not something feels "right," and virtual textures don't always hit the mark.

Now, researchers at the USC Viterbi School of Engineering have developed a new method for computers to achieve that true texture -- with the help of human beings.

Called a preference-driven model, the framework uses our ability to distinguish between the details of certain textures as a tool to give these virtual counterparts a tune-up.

The research was published in IEEE Transactions on Haptics by three USC Viterbi Ph.D. students in computer science, Shihan Lu, Mianlun Zheng and Matthew Fontaine, as well as Stefanos Nikolaidis, USC Viterbi assistant professor in computer science, and Heather Culbertson, USC Viterbi WiSE Gabilan Assistant Professor in Computer Science.

"We ask users to compare their feeling between the real texture and the virtual texture," Lu, the first author, explained. "The model then iteratively updates a virtual texture so that the virtual texture can match the real one in the end."

According to Fontaine, the idea first emerged when they shared a Haptic Interfaces and Virtual Environments class taught by Culbertson in the fall of 2019. They drew inspiration from the art application Picbreeder, which generates images based on a user's preferences, over and over, until it reaches the desired result.

"We thought, what if we could do that for textures?" Fontaine recalled.

Using this preference-driven model, the user is first given a real texture, and the model randomly generates three virtual textures using dozens of variables, from which the user can then pick the one that feels the most similar to the real thing. Over time, the search adjusts its distribution of these variables as it gets closer and closer to what the user prefers. According to Fontaine, this method has an advantage over directly recording and "playing back" textures, as there's always a gap between what the computer reads and what we feel.
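
The search Fontaine describes works like an interactive, preference-guided loop. Below is a minimal sketch in Python of how such a loop could be structured, assuming a simple Gaussian distribution over a few made-up texture parameters and an illustrative pick-and-update rule; it is not the authors' exact model or code.

```python
import numpy as np

# Hypothetical parameter names: the real model searches over dozens of
# texture variables; these four are placeholders for illustration.
PARAM_NAMES = ["roughness", "hardness", "vibration_gain", "friction"]
DIM = len(PARAM_NAMES)

def generate_candidates(mean, std, n=3, rng=None):
    """Sample n candidate virtual textures from the current search distribution."""
    rng = rng if rng is not None else np.random.default_rng()
    return rng.normal(mean, std, size=(n, DIM))

def preference_search(ask_user, rounds=20):
    """Iteratively tune a virtual texture from a user's choices.

    ask_user(candidates) should return the index of the candidate that
    feels closest to the real texture when rendered on the device.
    """
    mean = np.zeros(DIM)   # start from a neutral texture
    std = np.ones(DIM)     # broad initial search
    for _ in range(rounds):
        candidates = generate_candidates(mean, std)
        chosen = candidates[ask_user(candidates)]
        # Pull the search distribution toward the preferred candidate and
        # narrow it so later rounds make finer and finer adjustments.
        mean = 0.7 * mean + 0.3 * chosen
        std = 0.9 * std
    return dict(zip(PARAM_NAMES, mean))

# Example run with a simulated "user" who prefers whichever candidate is
# numerically closest to a hidden target texture.
if __name__ == "__main__":
    target = np.array([0.8, 0.4, 1.2, 0.6])
    pick = lambda cands: int(np.argmin(np.linalg.norm(cands - target, axis=1)))
    print(preference_search(pick))
```

In the real system, each chosen candidate would be rendered on a haptic device and compared against the physical surface before the user answers, rather than scored numerically as in this toy example.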

"You're measuring parameters of exactly how they feel it, rather than just mimicking what we can record," Fontaine said. There's going to be some error in how you recorded that texture, to how you play it back."

The only thing the user has to do is choose which texture matches best and adjust the amount of friction using a simple slider. Friction is essential to how we perceive textures, and it can vary from person to person. It's "very easy," Lu said.

Their work comes just in time for the emerging market for specific, accurate virtual textures. Everything from video games to fashion design is integrating haptic technology, and the existing databases of virtual textures can be improved through this user preference method.

"There is a growing popularity of the haptic device in video games and fashion design and surgery simulation," Lu said. "Even at home, we've started to see users with those (haptic) devices that are becoming as popular as the laptop. For example, with first-person video games, it will make them feel like they're really interacting with their environment."

Lu previously did other work on immersive technology, but with sound -- specifically, making the virtual texture even more immersive by introducing matching sounds when the tool interacts with it.

"When we are interacting with the environment through a tool, tactile feedback is only one modality, one kind of sensory feedback," Lu said. "Audio is another kind of sensory feedback, and both are very important."

The texture-search model also allows someone to take a virtual texture from a database, like the University of Pennsylvania's Haptic Texture Toolkit, and refine it until they get the result they want.

"You can use the previous virtual textures searched by others, and then based on those, you can then continue tuning it," Lu said. "You don't have to search from scratch every time."

This especially comes in handy for virtual textures that are used in training for dentistry or surgery, which need to be extremely accurate, according to Lu.

"Surgical training is definitely a huge area that requires very realistic textures and tactile feedback," Lu said. "Fashion design also requires a lot of precision in texture in development, before they go and fabricate it."

In the future, real textures may not even be required for the model, Lu explained. The way certain things in our lives feel is so intuitive that fine-tuning a texture to match that memory is something we can do inherently just by looking at a photo, without having the real texture for reference in front of us.

"When we see a table, we can imagine how the table will feel once we touch it," Lu said. "Using this prior knowledge we have of the surface, you can just provide visual feedback to the users, and it allows them to choose what matches."


Story Source:

Materials provided by University of Southern California. Note: Content may be edited for style and length.


Journal Reference:

  1. Shihan Lu, Mianlun Zheng, Matthew C. Fontaine, Stefanos Nikolaidis, Heather Marie Culbertson. Preference-Driven Texture Modeling Through Interactive Generation and Search. IEEE Transactions on Haptics, 2022. DOI: 10.1109/TOH.2022.3173935

