Engineer Gives Robots A New Way To 'See'
- Date:
- March 19, 1999
- Source:
- Johns Hopkins University
- Summary:
- A Johns Hopkins electrical engineer has developed a robotic vision system on a microchip, an alternative to conventional approaches that lets a moving robot react to obstacles much more quickly.
Computational Sensors Could Steer a Car or Guide Surgical Tools
A Johns Hopkins University electrical engineer has developed a new robotic vision system on a microchip that enables a toy car to follow a line around a test track, avoiding obstacles along the way. In the near future, the same technology may allow a robotic surgical tool to locate and operate on a clogged artery in a beating human heart.
The key to this system, says Ralph Etienne-Cummings, is a single chip that combines several critical functions: It performs analog and digital processing, extracts relevant information, makes decisions and communicates them to the robot. If the system in the toy car "sees" an obstacle ahead, it directs the vehicle to move around it. If the chip is used in a surveillance system, the camera can follow a moving target.
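To make the idea of "several critical functions on one chip" concrete, the sketch below is a rough software analogue of the sense, extract, decide, and communicate loop the chip collapses into a single device. It is only an illustration under assumed names and values; the actual system performs these stages in custom analog and digital circuitry, not in code.

```python
# Hypothetical software analogue of the on-chip pipeline: sense the scene,
# extract the relevant information, make a decision, and issue a command.

def extract_obstacle(frame, threshold=0.3):
    """Report whether a dark region (a possible obstacle) sits in the
    centre of the frame. 'frame' is a 2-D list of brightness values 0..1;
    the threshold is an invented, illustrative value."""
    rows, cols = len(frame), len(frame[0])
    centre = [frame[r][c]
              for r in range(rows // 3, 2 * rows // 3)
              for c in range(cols // 3, 2 * cols // 3)]
    return sum(centre) / len(centre) < threshold

def decide(obstacle_ahead):
    """On-chip decision making: steer around an obstacle, else hold course."""
    return "steer_around" if obstacle_ahead else "hold_course"

# One pass of the loop on a dummy 6x6 frame with a dark blob in the middle.
frame = [[1.0] * 6 for _ in range(6)]
for r in range(2, 4):
    for c in range(2, 4):
        frame[r][c] = 0.0

print(decide(extract_obstacle(frame)))  # -> "steer_around"
```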
Because the decision making is done on the microchip itself, not on a separate computer, the response time is much faster than that of other robotic vision systems, the researcher says. Also, he says, this system is much smaller, uses less power and can be mounted on mobile machines, including law-enforcement microrobots, autonomous flying machines and extra-terrestrial rovers.
"The idea of putting electronic sensing and processing in the same place is called computational sensing," explains Etienne-Cummings, an assistant professor of electrical and computer engineering at Johns Hopkins. "It was coined less than 10 years ago by the people who started this new line of research. Our goal is to revolutionize robotic vision or robotics in general. It hasn't happened yet, but we're making progress."
In a paper presented at the Conference on Intelligent Robots and Systems in Victoria, British Columbia, Etienne-Cummings outlined the advantages of this technology and described his success in using it in a toy car that maneuvered around a track without help from a human controller. The project marked one of the first times a biologically inspired computational visual sensor has been used to guide a robotic vehicle around obstacles as it followed a simulated road.
The technology used in this test has many potential applications, the engineer believes. Beyond their role in autonomous navigation and medical systems, computational sensors could allow robots to identify and pick up parts in manufacturing plants. In a video-conferencing system, a computational sensor could enable the camera to "lock on" to a speaker who wished to move around the room, the researcher says.
By processing and reacting to light as soon as it hits the system, these sensors take a cue from Mother Nature. "This resembles the early type of processing that takes place in a rabbit's eye or a frog's," Etienne-Cummings says. "The animal sees a shape moving up ahead. If the shape is small enough, it may be food, so the animal moves toward it. But if it's too large, it might be a threat to its safety, so the animal runs away. These reactions happen very, very quickly, in the earliest moments of biological processing."
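The approach-or-flee reaction described above amounts to a simple threshold on the apparent size of a moving shape. Here is a minimal, hedged Python sketch of that rule; the size measure and both thresholds are invented purely for illustration, not taken from the research.

```python
def react_to_shape(apparent_size, food_limit=0.1, threat_limit=0.5):
    """Early-vision style reaction: a small moving shape may be food
    (approach), a large one may be a threat (flee), otherwise ignore.
    Sizes are fractions of the visual field; limits are illustrative."""
    if apparent_size <= food_limit:
        return "approach"
    if apparent_size >= threat_limit:
        return "flee"
    return "ignore"

print(react_to_shape(0.05))  # small shape  -> "approach"
print(react_to_shape(0.70))  # large shape  -> "flee"
```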
When he designs computational sensors, the Johns Hopkins researcher is not trying to make electronic versions of biological cells and brain tissue. "I'm just trying to mimic their function," he says, "using the best electronic tools I can find."
In the toy car that Etienne-Cummings adapted, two sensors are mounted as "eyes" on the front of the vehicle. The microchips force the car to follow a line detected by the sensors, unless an obstacle appears in its path. To the chips, avoiding a crash takes priority over following the line, so they steer the car away from the obstacle. The system also "remembers" how it turned to avoid the obstacle so that it can steer the car back to the line to resume its original course.
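The behavior just described, line following that is overridden by obstacle avoidance, plus a memory of the avoidance turn used to steer back onto the line, can be sketched as a simple priority-based controller. The sketch below is an assumption about how such logic might look in software; the real system implements it in the chips themselves, and the sensor readings and steering values here are invented for illustration.

```python
class LineFollower:
    """Hedged sketch of the priority scheme: avoid obstacles first,
    follow the line otherwise, and 'unwind' the remembered avoidance
    turn to return to the original course."""

    def __init__(self):
        self.turn_memory = 0.0  # net steering applied while avoiding

    def step(self, line_offset, obstacle_ahead):
        """line_offset: how far the line sits from centre (-1..1);
        obstacle_ahead: True if the sensors report an obstacle."""
        if obstacle_ahead:
            steer = 0.5                # swerve away (illustrative value)
            self.turn_memory += steer  # remember how far we turned
        elif self.turn_memory != 0.0:
            steer = -self.turn_memory  # steer back toward the line
            self.turn_memory = 0.0
        else:
            steer = -line_offset       # ordinary line following
        return steer

# Example run: follow the line, swerve for an obstacle, then return to it.
car = LineFollower()
for offset, obstacle in [(0.1, False), (0.0, True), (0.0, False), (0.2, False)]:
    print(car.step(offset, obstacle))  # -0.1, 0.5, -0.5, -0.2
```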
Etienne-Cummings has also begun working with Johns Hopkins biomedical engineering researchers who are creating computer models of the heart. He hopes to use computational sensor technology to enable a robot arm to keep pace with a beating heart. If this technology is perfected, surgeons of the future may be able to use the robot to clear a blocked coronary artery in a beating heart.
Story Source:
Materials provided by Johns Hopkins University. Note: Content may be edited for style and length.