Four universities won a $1.2 million grant to develop prosthetics that deliver sensory information to patients and can be controlled by their thoughts. Rice University, the University of Michigan, Drexel University and the University of Maryland will work on the four-year project with funds from the National Science Foundation's Human-Centered Computing program.
Researchers at Rice will build a prosthetic arm that can be controlled by a cap of electrodes that read electrical activity on the scalp using electroencephalography. The EEG information will be combined with real-time data about blood-oxygen levels in the user's frontal lobe, gathered using functional near-infrared spectroscopy (fNIRS) technology developed by Drexel's brain imaging lab.
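In broad strokes, combining the two signal streams means reducing each to a feature and fusing them into a single control decision. The sketch below is purely illustrative, not the teams' actual algorithm; the feature names, weights, and threshold are assumptions.

```python
# Hypothetical sketch of fusing an EEG feature with an fNIRS feature into
# one grasp/rest command. All names, weights, and thresholds are
# illustrative assumptions, not the researchers' published method.

def fuse_eeg_fnirs(eeg_band_power, fnirs_oxy_change,
                   w_eeg=0.7, w_fnirs=0.3, threshold=0.5):
    """Return 'close_hand' or 'rest' from two normalized features (0..1).

    eeg_band_power   -- e.g., motor-imagery band-power strength from the scalp EEG
    fnirs_oxy_change -- e.g., relative oxygenation increase in the frontal lobe
    """
    score = w_eeg * eeg_band_power + w_fnirs * fnirs_oxy_change
    return "close_hand" if score >= threshold else "rest"

print(fuse_eeg_fnirs(0.8, 0.6))  # strong intent in both signals -> close_hand
print(fuse_eeg_fnirs(0.2, 0.1))  # weak signals -> rest
```

A weighted fusion like this is one of the simplest ways to let a slower hemodynamic signal (fNIRS) confirm or veto a fast but noisier electrical one (EEG); real brain-computer interfaces typically use trained classifiers rather than fixed weights.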
The prosthetic will include sensors that gather tactile data from its artificial fingertips and information from the hand about the amount of force it uses in grasping. The data will be fed back to the user via touch pads that vibrate, stretch and squeeze the skin where the prosthesis attaches to the body.
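The feedback loop described above amounts to mapping a measured grip force onto drive levels for the skin-contact actuators. The following is a minimal sketch under assumed actuator names and ranges; the article does not specify the actual mapping.

```python
# Illustrative sketch of the sensory-feedback mapping: grip force measured
# at the hand is converted to normalized commands for the vibrating,
# stretching, and squeezing touch pads. Actuator names, the 20 N full-scale
# force, and the response curves are assumptions for illustration only.

def force_to_feedback(grip_force_n, max_force_n=20.0):
    """Map measured grip force (newtons) to actuator commands in 0..1."""
    level = max(0.0, min(grip_force_n / max_force_n, 1.0))  # clamp to 0..1
    return {
        "vibration": level,          # stronger grip -> stronger vibration
        "skin_stretch": level ** 2,  # nonlinear: emphasize high forces
        "squeeze": level,
    }

cmds = force_to_feedback(10.0)  # half of the assumed maximum force
print(cmds)
```

Clamping the force reading protects the actuators from sensor spikes, and a nonlinear curve on one channel is a common trick to make dangerous, near-maximum forces feel disproportionately salient to the wearer.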
The approach is seen as a more capable alternative to today's interfaces that use muscle contractions on the chest or arm to control a prosthetic. "Long term, we hope prosthetics have the same capabilities as natural limbs," said Marcia O'Malley, a co-principal investigator at Rice, speaking in an online video.
The group previously demonstrated a prosthetic gripper that allowed amputees to correctly perceive and manipulate objects based upon sensory feedback. University of Maryland researchers have demonstrated a technique using EEG signals that allowed test subjects to move a cursor on a computer screen simply by thinking about it.
"What remains is to bring all of it--noninvasive neural decoding, direct brain control and tactile sensory feedback--together into one device," said O'Malley, speaking in a press statement.
"Ideally, [our] tactile or haptic feedback will make it easier for patients to get their prosthetic arms to do exactly what they want them to do," said Patricia Shewokis, a researcher at Drexel.
O'Malley said the new technology is a big leap over existing prosthetic devices, which don't allow amputees to feel what they touch. Some prostheses today use force-feedback systems that vibrate--like the vibrate mode on a mobile phone--to provide limited information about objects a prosthetic hand is gripping.
"Often, these vibro-tactile cues aren't very helpful," O'Malley said. "Many times individuals simply rely on visual feedback--watching their prosthesis grasp an object--to infer whether the object is soft or hard, [or] how tightly they are grasping it, [so] there's a lot of room for improvement," she said.
A year ago the Defense Advanced Research Projects Agency (DARPA) awarded Johns Hopkins a $34.5 million grant to create an interface using neural sensors implanted in the brain to control a prosthetic. The four-year project will use an artificial arm with 22 degrees of motion developed at the university.
Better prosthetics have been a focus for DARPA since the start of the Afghanistan and Iraq wars, in which bombs have maimed many soldiers. The agency's Human-Assisted Neural Devices program aims to let an amputee's thoughts control a mechanical hand. A follow-on program on prosthetic arms includes the latest work with Johns Hopkins.