PORTLAND, Ore. A new mobile robot promises to aid developers of artificial retinas and other visual prosthetics by testing the endless variations of video filters proposed to improve their effectiveness.
Called "Cyclops," for its single video camera "eye," the wheeled robot emulates the quality of visual scenes enabled by current low-resolution implants, performing navigation tasks that may allow engineers to perfect the video filter algorithms needed to help artificial retina recipients.
"We believe we have a better way to test new video filtering techniques for visual prostheses," said professor Wolfgang Fink of the University of Arizona and a visiting associate in physics at Caltech. "Our Cyclops robot can test our image-processing algorithms at the same resolution as an artificial retina."
The Cyclops robot is designed as a surrogate for the blind in testing visual prostheses. Credit: Caltech/Wolfgang Fink, Mark Tarbell
The main problem with visual prostheses is low resolution, but that can be partially compensated for by choosing the right image processing routine. For instance, if a patient is trying to find a doorway in a darkened hallway, there may not be enough contrast to perceive a dark doorway compared to the almost-as-dark wall next to it. However, if an image processing routine measured all the pixel values and assigned pure white to the lightest pixels of the wall and jet black to the darkest pixels representing the doorway, then a patient could perceive it more easily.
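The routine described above is a form of contrast stretching. As a rough illustration of the idea (not the researchers' actual code), the sketch below remaps a list of grayscale pixel values so the darkest pixel becomes pure black (0) and the lightest becomes pure white (255); the function name and the flat-image fallback are illustrative assumptions.

```python
def stretch_contrast(pixels):
    """Remap grayscale values so the darkest pixel becomes 0
    and the lightest becomes 255, widening perceived contrast.
    `pixels` is a flat list of 0-255 intensity values."""
    lo, hi = min(pixels), max(pixels)
    if hi == lo:
        # Flat image: no contrast to stretch, return all black.
        return [0] * len(pixels)
    # Linearly rescale each value into the full 0-255 range.
    return [round(255 * (p - lo) / (hi - lo)) for p in pixels]
```

Applied to a dim hallway scene, values that were clustered together (say, 40 for the doorway and 60 for the wall) are pushed to opposite ends of the range, making the doorway far easier to distinguish.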
Fink and Caltech visiting scientist Mark Tarbell already have funding from the National Science Foundation to create the video filters, but testing remains a problem: patients with implants are already overloaded with testing regimes.
To reduce that burden, the researchers first turned to test subjects with normal sight, reducing the video camera's resolution to match that of the retinal implant. The technique failed, however, because sighted subjects were able to draw on the image-enhancing routines built into their own brains.
Cyclops, on the other hand, has only the visual processing routines that Fink and Tarbell programmed into it. After establishing a baseline for how long the robot takes to find a doorway at full camera resolution, they reduced the resolution to match that of an artificial retina and then measured how well each image processing routine restored performance relative to that baseline.
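Matching an implant's resolution amounts to downsampling each camera frame to a very coarse pixel grid. A minimal sketch of one common way to do this, block averaging, is shown below; the function name and the choice of output grid size are assumptions for illustration, not details from the researchers.

```python
def downsample(frame, out_h, out_w):
    """Reduce a grayscale frame (list of rows of 0-255 values)
    to an out_h x out_w grid by averaging each block of pixels,
    emulating an implant's coarse electrode array."""
    h, w = len(frame), len(frame[0])
    bh, bw = h // out_h, w // out_w  # block size per output pixel
    result = []
    for i in range(out_h):
        row = []
        for j in range(out_w):
            # Average all source pixels falling in this block.
            block = [frame[i * bh + y][j * bw + x]
                     for y in range(bh) for x in range(bw)]
            row.append(sum(block) // len(block))
        result.append(row)
    return result
```

Running the robot's navigation task on such coarsened frames, with and without a candidate filter applied, gives a direct measure of how much each routine recovers of the full-resolution baseline.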
After trying out hundreds of variations of image processing routines while the Cyclops robot performed a variety of navigation tasks, the researchers identified a few that consistently improved the robot's performance. The most promising routines were then sent to artificial retina makers to be tested on implant recipients.