
Scientists Create Robot Surrogate For Blind Persons In Testing Visual Prostheses

Date:
October 20, 2009
Source:
California Institute of Technology
Summary:
Scientists have created a remote-controlled robot that is able to simulate the "visual" experience of a blind person who has been implanted with a visual prosthesis, such as an artificial retina. An artificial retina consists of a silicon chip studded with a varying number of electrodes that directly stimulate retinal nerve cells. It is hoped that this approach may one day give blind persons the freedom of independent mobility.

Scientists at the California Institute of Technology (Caltech) have created a remote-controlled robot that is able to simulate the "visual" experience of a blind person who has been implanted with a visual prosthesis, such as an artificial retina. An artificial retina consists of a silicon chip studded with a varying number of electrodes that directly stimulate retinal nerve cells. It is hoped that this approach may one day give blind persons the freedom of independent mobility.

The robot—or, rather, the mobile robotic platform, or rover—is called CYCLOPS. It is the first such device to emulate what the blind can see with an implant, says Wolfgang Fink, a visiting associate in physics at Caltech and the Edward and Maria Keonjian Distinguished Professor in Microelectronics at the University of Arizona. Its development and potential uses are described in a paper recently published online in the journal Computer Methods and Programs in Biomedicine.

An artificial retina, also known as a retinal prosthesis, may use either an internal or external miniature camera to capture images. The captured images are then processed and passed along to the implanted silicon chip's electrode array. (Ongoing work at Caltech's Visual and Autonomous Exploration Systems Research Laboratory by Fink and Caltech visiting scientist Mark Tarbell has focused on the creation and refinement of these image-processing algorithms.) The chip directly stimulates the eye's functional retinal ganglion cells, which carry the image information to the vision centers in the brain.
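
As a rough picture of what that processing step does, the sketch below averages a grayscale camera frame down to a small electrode-like grid, the kind of coarse, pixelized percept a retinal implant can deliver. It is a minimal, hypothetical illustration in Python; the grid size and function name are assumptions, not the Caltech team's actual software.

    import numpy as np

    def to_electrode_grid(frame, rows, cols):
        """Average a grayscale frame (2-D array, values 0-255) down to a
        rows x cols grid, mimicking a coarse electrode array. Hypothetical
        sketch only, not the published image-processing pipeline."""
        h, w = frame.shape
        # Crop so the frame divides evenly into the electrode grid.
        frame = frame[:h - h % rows, :w - w % cols]
        blocks = frame.reshape(rows, frame.shape[0] // rows,
                               cols, frame.shape[1] // cols)
        # Each electrode's stimulation level is the mean brightness of its block.
        return blocks.mean(axis=(1, 3))

    # Example: a 480x640 frame reduced to a 4x4, 16-electrode percept.
    frame = np.random.randint(0, 256, (480, 640)).astype(float)
    print(to_electrode_grid(frame, 4, 4).round(1))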

CYCLOPS fills a void in the process of testing visual prostheses, explains Fink. "How do you approximate what the blind can see with the implant so you can figure out how to make it better?" he asks.

One way is to test potential enhancements on a blind person who has been given an artificial retina. And, indeed, the retinal implant research team does this often, and extensively. But few people worldwide have been implanted with retinal prostheses, and there is only so much testing they can be asked to endure.

Another way is to give sighted people devices that downgrade their vision to what might be expected using artificial vision prostheses. And this, too, is often done. But it's a less-than-ideal solution since the brain of a sighted person is adept at taking poor-quality images and processing them in various ways, adding detail as needed. This processing is what allows most people to see in dim light, for example, or through smoke or fog.

"A sighted person's objectivity is impaired," Fink says. "They may not be able to get to the level of what a blind person truly experiences."

Enter one more possible solution: CYCLOPS. "We can use CYCLOPS in lieu of a blind person," Fink explains. "We can equip it with a camera just like what a blind person would have with a retinal prosthesis, and that puts us in the unique position of being able to dictate what the robot receives as visual input."

Now, if scientists want to see how much better the resolution is when a retinal prosthesis has an array of 50 pixels as opposed to 16 pixels, they can try both out on CYCLOPS. They might do this by asking the robot to follow a black line down a white-tiled hallway, or by seeing whether it can find—and enter—a darkened doorway.
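
One way to preview such a comparison in software, before putting either array on the rover, is to check whether a simple steering rule can still locate the dark line once the image has been reduced to each electrode count. The sketch below reuses the hypothetical to_electrode_grid helper from the earlier example; the grid sizes and the steering heuristic are illustrative assumptions, not taken from the published paper.

    import numpy as np

    def steer_toward_line(percept):
        """Return a steering offset in [-1, 1]; negative means the line
        lies to the left. The 'line' is taken to be the darkest column
        of the electrode grid. Illustrative heuristic only."""
        darkest = int(np.argmin(percept.mean(axis=0)))
        center = (percept.shape[1] - 1) / 2
        return (darkest - center) / center

    # Synthetic hallway frame: bright floor with a dark line right of center.
    frame = np.full((480, 640), 230.0)
    frame[:, 400:430] = 20.0
    for rows, cols in [(4, 4), (7, 7)]:   # roughly 16 vs. 50 electrodes
        percept = to_electrode_grid(frame, rows, cols)
        print(rows * cols, "electrodes -> steering", round(steer_toward_line(percept), 2))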

"We're not quite at that stage yet," Fink cautions, referring to such independent maneuvering.

CYCLOPS's camera is gimballed, which means it can emulate left-to-right and up-and-down head movements. The input from the camera runs through the onboard computing platform, which does real-time image processing. For now, however, the platform itself is moved around remotely, via a joystick. "The platform can be operated from anywhere in the world, through its wireless Internet connection," says Tarbell.
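
That division of labor, onboard image processing with offboard joystick control, can be pictured as a simple per-frame loop. The outline below is purely hypothetical: the stub classes, command format, and loop rate are assumptions for illustration, and it again reuses the to_electrode_grid sketch from above rather than the actual CYCLOPS code.

    import time
    import numpy as np

    class StubCamera:
        """Stand-in for the gimballed camera: returns synthetic grayscale frames."""
        def capture(self):
            return np.random.randint(0, 256, (480, 640)).astype(float)

    class StubLink:
        """Stand-in for the wireless link to the remote operator."""
        def send(self, percept):
            pass  # would transmit the processed percept to the operator's display
        def latest_joystick_command(self):
            return {"forward": 0.2, "turn": 0.0}  # fixed command for this demo

    class StubDrive:
        """Stand-in for the rover's motor controller."""
        def apply(self, command):
            print("drive", command)

    def run_loop(camera, link, drive, frames=3, hz=10):
        """Per-frame loop: process onboard, then steer from the remote joystick."""
        for _ in range(frames):
            percept = to_electrode_grid(camera.capture(), 4, 4)  # onboard processing
            link.send(percept)                            # only the small percept crosses the link
            drive.apply(link.latest_joystick_command())   # the operator, not the percept, steers for now
            time.sleep(1.0 / hz)

    run_loop(StubCamera(), StubLink(), StubDrive())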

"We have the image-processing algorithms running locally on the robot's platform—but we have to get it to the point where it has complete control of its own responses," Fink says.

Once that's done, he adds, "we can run many, many tests without bothering the blind prosthesis carriers."

Among the things they hope to learn from such testing is how to enhance a workplace or living environment to make it more accessible to a blind person with a particular vision implant. If CYCLOPS can use computer-enhanced images from a 50-pixel array to make its way safely through a room with a chair in one corner, a sofa along the wall, and a coffee table in the middle, then there is a good chance that a blind person with a 50-pixel retinal prosthesis would be able to do the same.

The results of tests on the CYCLOPS robot should also help researchers determine whether a particular version of a prosthesis, say, or its onboard image-processing software, is even worth testing in blind persons. "We'll be coming in with a much more educated initial starting point, after which we'll be able to see how blind people work with these implants," Fink notes.

And the implants need to work well. After all, Fink points out, "Blind people using a cane or a canine unit can move around impressively well. For an implant to be useful, it has to have the implicit promise that it will surpass these tools. The ultimate promise—the hope—is that we instill in them such useful vision that they can attain independent mobility, can recognize people, and can go about their daily lives."

The work done in the paper by Fink and Tarbell, "CYCLOPS: A mobile robotic platform for testing and validating image processing and autonomous navigation algorithms in support of artificial vision prostheses," was supported by a grant from the National Science Foundation. Fink and Tarbell have filed a provisional patent on the technology on behalf of Caltech.


Story Source:

Materials provided by California Institute of Technology. Note: Content may be edited for style and length.


