MINNEAPOLIS -- Going far beyond the scalpel, researchers are tapping hybrid sensors, wireless technology and video to create tomorrow's surgical instruments.
Carnegie Mellon University has prototyped a video-based control system that automates a form of microscopic eye surgery, yielding a procedure that could be both faster and more accurate than current methods. Separately, researchers at Johns Hopkins University are trying to fuse optical and electromechanical sensors to enable more accurate surgical instruments.
Brian Becker, a Carnegie Mellon PhD student, described a video-based control system that automates a form of retinal surgery known as laser coagulation. In early tests, the system proved 40 percent faster and 22 percent more accurate than the current approach.
Separate teams of researchers are collaborating on wired surgical instruments that could more accurately estimate their position and orientation using a combination of accelerometers, magnetometers and optical sensors. The work is a partnership between Johns Hopkins and the Fraunhofer Institute.
"We believe it's possible to fuse inertial and electromagnetic sensors--this is something of a Holy Grail," in digital surgery, Peter Kazanzides, a chief systems and robotics engineer at Johns Hopkins.
Fraunhofer is building custom electronics boards for the device, which will be tested later this month. Ultimately the group hopes to use wireless technology to control the hybrid instruments.
There's plenty of room for innovation in sensors for surgical gear, said Cameron Riviere, a bioengineering professor at CMU. Tracking moving soft tissue in real time during an operation is at the leading edge of digital techniques and will require new sensor technologies, he said.
The researchers discussed their work at the recent conference of the IEEE Engineering in Medicine and Biology Society (EMBC '09).