Robots get a feel for the world: Touch more sensitive than a human's
- Date: June 18, 2012
- Source: University of Southern California
- Summary: What does a robot feel when it touches something? Little or nothing until now. Specially designed robots can now be equipped with a sense of touch even more sensitive than that of humans.
What does a robot feel when it touches something? Little or nothing until now. But with the right sensors, actuators and software, robots can be given the sense of feel -- or at least the ability to identify different materials by touch.
Researchers at the University of Southern California's Viterbi School of Engineering published a study June 18 in Frontiers in Neurorobotics showing that a specially designed robot can outperform humans in identifying a wide range of natural materials according to their textures, paving the way for advancements in prostheses, personal assistive robots and consumer product testing.
The robot was equipped with a new type of tactile sensor built to mimic the human fingertip. It also used a newly designed algorithm to make decisions about how to explore the outside world by imitating human strategies. Beyond texture, the sensor can also tell where and in which direction forces are applied to the fingertip, and even the thermal properties of an object being touched.
Like the human finger, the group's BioTac® sensor has a soft, flexible skin over a liquid filling. The skin even has fingerprints on its surface, greatly enhancing its sensitivity to vibration. As the finger slides over a textured surface, the skin vibrates in characteristic ways. These vibrations are detected by a hydrophone inside the bone-like core of the finger. The human finger uses similar vibrations to identify textures, but the robot finger is even more sensitive.
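In broad terms, different textures excite different vibration patterns in the skin, and those patterns can be summarized by their frequency content. The sketch below is a rough illustration of that idea only, not the authors' implementation: it computes two simple spectral features (total vibration power and dominant frequency) from a hydrophone trace, with the sampling rate and signal names chosen as assumptions for the example.

```python
import numpy as np

def vibration_features(signal, sample_rate_hz=2000.0):
    """Summarize a fingertip vibration trace with two simple spectral features.

    `signal` is a 1-D array of hydrophone samples recorded while the finger
    slides over a surface. The sampling rate here is an assumed value, not
    the BioTac's actual specification.
    """
    # Remove the DC offset so only the vibration (AC) component remains.
    centered = signal - np.mean(signal)

    # Power spectrum via the real-valued FFT.
    spectrum = np.abs(np.fft.rfft(centered)) ** 2
    freqs = np.fft.rfftfreq(len(centered), d=1.0 / sample_rate_hz)

    total_power = float(np.sum(spectrum))            # overall vibration intensity
    dominant_hz = float(freqs[np.argmax(spectrum)])  # strongest vibration frequency
    return {"total_power": total_power, "dominant_hz": dominant_hz}

# Example: a synthetic 250 Hz "texture" vibration with added noise.
rng = np.random.default_rng(0)
t = np.arange(0, 1.0, 1.0 / 2000.0)
trace = 0.5 * np.sin(2 * np.pi * 250.0 * t) + 0.05 * rng.standard_normal(t.size)
print(vibration_features(trace))
```

Distinct textures would produce different feature values under this kind of summary, which is what makes identification by touch possible in the first place.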
When humans try to identify an object by touch, they use a wide range of exploratory movements based on their prior experience with similar objects. A famous theorem by 18th-century mathematician Thomas Bayes describes how decisions might be made from the information obtained during these movements. Until now, however, there was no principled way to decide which exploratory movement to make next. The article, authored by Professor of Biomedical Engineering Gerald Loeb and recently graduated doctoral student Jeremy Fishel, describes their approach to this general problem, which they call "Bayesian Exploration."
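The core idea is to maintain a belief (a probability for each candidate texture), update that belief with Bayes' rule after each movement, and then pick the next movement expected to best tell apart the textures that remain plausible. The sketch below is a minimal, simplified illustration of that loop, assuming each movement yields a single Gaussian-distributed measurement per texture; the movements, textures, statistics, and selection heuristic are all illustrative assumptions, not the authors' published algorithm.

```python
import numpy as np

# Hypothetical training statistics: for each exploratory movement, the mean and
# standard deviation of a scalar feature measured on each candidate texture.
MOVEMENTS = ["light_slide", "firm_slide", "press"]
TEXTURES = ["denim", "felt", "sandpaper"]
MEAN = np.array([[0.2, 0.5, 0.9],    # light_slide
                 [0.3, 0.4, 0.8],    # firm_slide
                 [0.1, 0.1, 0.2]])   # press
STD = np.full_like(MEAN, 0.1)

def gaussian_likelihood(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def update_belief(belief, movement, measurement):
    """Bayes' rule: posterior is proportional to prior times likelihood."""
    posterior = belief * gaussian_likelihood(measurement, MEAN[movement], STD[movement])
    return posterior / posterior.sum()

def choose_movement(belief):
    """Pick the movement whose predicted outcomes differ most between the two
    currently most probable textures (a simple discriminability heuristic)."""
    top_two = np.argsort(belief)[-2:]
    separation = np.abs(MEAN[:, top_two[0]] - MEAN[:, top_two[1]]) / STD[:, top_two[0]]
    return int(np.argmax(separation))

# Simulated identification of an unknown texture (here: sandpaper).
rng = np.random.default_rng(1)
true_texture = 2
belief = np.ones(len(TEXTURES)) / len(TEXTURES)        # uniform prior
for _ in range(10):                                    # cap the number of movements
    m = choose_movement(belief)
    measurement = rng.normal(MEAN[m, true_texture], STD[m, true_texture])
    belief = update_belief(belief, m, measurement)
    print(f"{MOVEMENTS[m]:>11s} -> belief {np.round(belief, 3)}")
    if belief.max() > 0.99:
        break
print("identified:", TEXTURES[int(np.argmax(belief))])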
Built by Fishel, the specialized robot was trained on 117 common materials gathered from fabric, stationery and hardware stores. When confronted with one material at random, the robot could correctly identify the material 95% of the time, after intelligently selecting and making an average of five exploratory movements. It was only rarely confused by pairs of similar textures that human subjects making their own exploratory movements could not distinguish at all.
So, is touch another task that humans will outsource to robots? Fishel and Loeb point out that while their robot is very good at identifying which textures are similar to each other, it has no way to tell which textures people will prefer. Instead, they say this robot touch technology could be used in human prostheses or to assist companies that employ experts to assess the feel of consumer products and even human skin.
Loeb and Fishel are partners in SynTouch LLC, which develops and manufactures tactile sensors for mechatronic systems that mimic the human hand. Founded in 2008 by researchers from USC's Medical Device Development Facility, the start-up is now selling its BioTac sensors to other researchers and to manufacturers of industrial robots and prosthetic hands.
Story Source:
Materials provided by University of Southern California. Note: Content may be edited for style and length.
Journal Reference:
- Jeremy A. Fishel, Gerald E. Loeb. Bayesian Exploration for Intelligent Identification of Textures. Frontiers in Neurorobotics, 2012; 6. DOI: 10.3389/fnbot.2012.00004