
New Lab Could Help Robots "Feel" More Like Humans

Date:
September 28, 2000
Source:
Johns Hopkins University
Summary:
If robots are going to have a sense of "touch" to go with their senses of sight, hearing and smell, someone needs to build "fingers" for them. A young Johns Hopkins mechanical engineer is working on it.

Mechanical Fingers May Soon Explore Textures, Shapes for Physicians, Geologists

Allison Okamura wants to give robots the gift of touch. To do this, Okamura, who has just joined The Johns Hopkins University's engineering faculty, is setting up the school's first lab dedicated to the cutting-edge field of robotic haptic exploration. "'Haptic' means anything related to the sense of touch," Okamura says. "One part of my work involves robotic fingers. I program them to explore unknown environments and give them tactile sensing and force sensing. I try to emulate the human ability to manipulate, touch and explore."

Her work is part of a larger effort to create more sophisticated machines to take over tasks that are too dangerous, too tedious or too difficult for humans. To achieve this goal, many researchers are working on systems that give robots "eyes" to identify objects and avoid obstacles. But Okamura is one of the few engineers trying to replicate the sense of touch. "Vision is obviously very important," she says. "But if you can imagine going through life only seeing things but never being able to touch them, it's obvious that touch is also very important. 'Touch' is also something that's very difficult to get robots to do. Vision is a passive sense: you can look at something without affecting it. But in order to touch something, the robot has to interact with the object and manipulate it."

Though robots with a sense of touch may be difficult to build, they could produce important payoffs. For salvage operations and scientific expeditions, the U.S. Navy wants robots that could run their fingers along objects resting on the ocean floor. NASA is interested in robotic hands that could transmit information about the strength and texture of rocks on other planets. Here on Earth, surgical robots with a fine sense of touch could "feel" the difference between a blood vessel and a bone. At Johns Hopkins, Okamura plans to work with engineers and physicians who have established a center devoted to the use of robots and computers in medicine.

Okamura received her doctorate in mechanical engineering from Stanford University shortly before she joined Johns Hopkins as an assistant professor of mechanical engineering. As a graduate student, she worked on a robot equipped with two soft fingertips made of rubber-coated foam. Tiny nibs on the rubber coverings behaved like the skin on human fingers, helping the robot sense and grasp unfamiliar objects. Using specialized tactile sensors and control methods, these robotic fingers explored objects to gather information about surface properties such as cracks, ridges and textures. In her new lab at Johns Hopkins, Okamura plans to build on this research and develop a new robotic finger with a sphere at the tip, capable of rotating like a paint roller. "A sphere like this could move all over a surface," she says. "It would be excellent for exploration. I'm hoping to build a system that can recognize features first on a hard surface and later on a soft surface, which is more difficult."
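
To give a rough flavor of the kind of processing such a fingertip might perform, the sketch below flags candidate cracks and ridges in a single sweep of tactile readings by comparing each sample against a local average. This is only an illustrative toy, not Okamura's method; the sensor values, window size and thresholds are assumptions made for the example.

```python
# Illustrative sketch: flagging surface features (cracks, ridges) from one
# sweep of tactile-sensor height readings. All values here are hypothetical.

def detect_features(heights, window=5, threshold=0.5):
    """Return (index, label) pairs where a reading deviates sharply from a
    local moving average -- a crude stand-in for crack/ridge detection."""
    features = []
    for i in range(window, len(heights) - window):
        neighborhood = heights[i - window:i + window + 1]
        baseline = sum(neighborhood) / len(neighborhood)
        deviation = heights[i] - baseline
        if deviation > threshold:
            features.append((i, "ridge"))   # bump above the local surface
        elif deviation < -threshold:
            features.append((i, "crack"))   # dip below the local surface
    return features

# Example: a mostly flat sweep with one ridge and one crack
sweep = [0.0] * 20 + [1.2] + [0.0] * 20 + [-1.1] + [0.0] * 20
print(detect_features(sweep))   # -> [(20, 'ridge'), (41, 'crack')]
```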

Her new Haptic Exploration Lab also will focus on a related field: using computers and a specially constructed joystick, stylus or glove to transmit sensory information to human hands. Such haptic interfaces allow users to "feel" objects that exist in a virtual environment. For example, if a user bumps into a tree or kicks a soccer ball within a computer-generated world, the joystick vibrates or provides force feedback to make the cyber-encounter feel real.

Haptic interfaces like this can add entertaining new dimensions to computer games or educational programs. But Okamura says this technology also could help a surgeon practice a delicate operation without risk to a human patient. Similarly, it could allow a geologist on Earth to "feel" the texture of a boulder discovered by a robotic exploration device on Mars. For her lab, Okamura recently acquired an experimental 3GM Haptic Interface from the San Jose-based Immersion Corp. The device, which Okamura helped develop at Immersion, has a stylus that allows the user to "feel" three-dimensional objects in a virtual environment. She plans to write new software for the device so that it can be used for medical applications.
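
A common way to render this kind of force feedback is a simple spring law: when the user's stylus penetrates a virtual surface, the device pushes back in proportion to the penetration depth. The minimal sketch below shows that idea under assumed values; the stiffness constant, the 1 kHz update rate and the device calls mentioned in the comment are illustrative, not details drawn from the article or from Immersion's hardware.

```python
# Minimal sketch of spring-based force feedback against a virtual wall.
# Constants are assumptions chosen for illustration only.

WALL_POSITION = 0.0   # virtual wall at x = 0; the user approaches from x > 0
STIFFNESS = 800.0     # N/m, a plausible order of magnitude for a haptic wall

def wall_force(stylus_x):
    """Return the feedback force (N) to command on the haptic device."""
    penetration = WALL_POSITION - stylus_x
    if penetration > 0:            # stylus is inside the wall
        return STIFFNESS * penetration
    return 0.0                     # free space: no force

# One step of a (conceptual) 1 kHz haptic loop might look like:
#   position = device.read_position(); device.apply_force(wall_force(position))
print(wall_force(0.02), wall_force(-0.005))   # -> 0.0 4.0
```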

Perfecting the mechanical devices and software needed to simulate the human sense of touch is a challenge that could take decades, but Okamura is eager to conduct some of the basic research. "Human beings, obviously, have amazing tactile sensing ability," she says. "What we've done so far with robots doesn't even come close. There's a lot of work to be done."

Color images available of Professor Okamura and a haptic interface; contact Phil Sneiderman.

Story Source:

Materials provided by Johns Hopkins University. Note: Content may be edited for style and length.


