NEWPORT, Rhode Island -- Next week, a "mind reading" technology will be demonstrated at the Association for Computing Machinery (ACM) Symposium on User Interface Software and Technology (UIST 2007, Oct. 7 to 10). Using functional near-infrared spectroscopy (fNIRS), Tufts University researchers have crafted machine learning algorithms that gauge users' "stress levels" as they perform tasks of varying mental workload (from bored to overwhelmed), and that adjust the man-machine interface to match.
"We want to gather information from the brain to improve computer user interfaces, but we are not trying to help the disabled," said Tufts professor Robert Jacob. "Instead, we are trying to improve the user interface for normal computer users."
Functional near-infrared spectroscopy is a cutting-edge technology currently in clinical trials to detect tumors through the skin. But the Tufts researchers are repurposing it to enhance the man-machine interface for critical computer users, such as air traffic controllers, who need to track and manage stress levels to ensure public safety. To that end, Jacob's human-computer interaction (HCI) group has enlisted Tufts biomedical engineering professor Sergio Fantini, who was already studying functional near-infrared spectroscopy (fNIRS) for tumor detection.
"The technology works in a manner similar to shining a light through your fingers. Here, infrared laser diodes shine light that penetrates the brain and is either absorbed or reflected depending on how oxygenated your blood is; the more oxygenated the blood is, the more activity is going on in that region of the brain," said Jacob.
The infrared sensors are mounted on a headband with eight laser diodes sending near-infrared light through the forehead to a depth of two to three centimeters (about an inch), where it scatters inside the brain's frontal lobe. Oxygenated and deoxygenated hemoglobin absorb near-infrared light differently, so the amount of light scattered back to the headband's two infrared detectors reveals how oxygenated the blood is. Since oxygenation levels indicate brain activity, those readings can be mapped to stress levels on a scale ranging from bored (low stress) to overwhelmed (high stress).
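The standard way to turn such optical readings into hemoglobin concentration changes is the modified Beer-Lambert law: measure optical-density changes at two wavelengths and solve a small linear system for oxygenated and deoxygenated hemoglobin. The sketch below illustrates that math only; the extinction coefficients, path length, and differential pathlength factor are illustrative placeholders, not values from the Tufts system.

```python
import numpy as np

# Illustrative extinction coefficients (1/(mM*cm)) for [HbO, HbR] at two
# wavelengths; placeholder numbers, not clinical constants.
eps = np.array([[1.0, 3.0],   # shorter wavelength (e.g. ~690 nm)
                [2.5, 0.8]])  # longer wavelength (e.g. ~830 nm)
d = 3.0    # assumed source-detector separation, cm
dpf = 6.0  # assumed differential pathlength factor

def hb_changes(delta_od):
    """Solve the modified Beer-Lambert system for (dHbO, dHbR) in mM."""
    return np.linalg.solve(eps * d * dpf, delta_od)

# Forward-generate optical-density changes from a known state, then recover it.
true_change = np.array([0.02, -0.01])        # mM change in [HbO, HbR]
delta_od = (eps * d * dpf) @ true_change
recovered = hb_changes(delta_od)             # ~= [0.02, -0.01]
```

More oxygenated blood (higher HbO) in a cortical region is what the researchers read as greater activity there.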
A multilayer perceptron neural network with 16 inputs, one hidden layer, and up to five outputs was employed to associate the sensed blood oxygenation levels with the stress levels of specific users. In the tests, users were shown different patterns of Rubik's cubes and given just nine seconds to identify how many colors were present on each cube. By starting with just two colors and working up from there, the researchers were able to teach the neural network to recognize stress levels.
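A minimal sketch of such a network's forward pass is shown below, using the dimensions the researchers report (16 inputs, one hidden layer, five outputs). The hidden-layer width, activation functions, and weights here are placeholders for illustration; the real network was trained on recorded fNIRS signals.

```python
import numpy as np

rng = np.random.default_rng(0)

# 16 oxygenation features in, one hidden layer (width of 8 is assumed),
# 5 workload-level outputs. Random weights stand in for trained ones.
W1, b1 = rng.normal(size=(8, 16)), np.zeros(8)
W2, b2 = rng.normal(size=(5, 8)), np.zeros(5)

def classify_workload(x):
    """Forward pass: 16 sensor features -> probabilities over 5 levels."""
    h = np.tanh(W1 @ x + b1)                  # hidden layer
    z = W2 @ h + b2
    p = np.exp(z - z.max())                   # softmax over workload levels
    return p / p.sum()

probs = classify_workload(rng.normal(size=16))
level = int(np.argmax(probs))  # 0 (bored) .. 4 (overwhelmed)
```

Training on the Rubik's-cube trials would fit the weights so that each output corresponds to one of the induced workload levels.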
To use the sensed stress levels to improve the computer user interface, the researchers sought to match the modest capabilities of their sensors to the level of enhancement provided. Rather than giving users direct mental control over a precise aspect of the interface (such as moving the mouse), as assistive brain-computer interfaces attempt for the disabled, Jacob's team pursued more modest enhancements suited to the coarse information they can gather about the user.
"Since our information about the brain is very modest today, we wanted to respond with correspondingly modest, but useful, improvements in the user interface as a result of monitoring that information," said Jacob.
For instance, air traffic control centers today have their workload monitored by software that tries to divide tasks equally among the on-duty controllers. But by monitoring the stress levels in each controller's brain, planes could be assigned more intelligently, based on the current stress levels of the on-duty controllers.
"So far our work is very preliminary: we are just starting to take measurements and trying to figure out what we can get out of the brain and how we can use it in interfaces; we are nowhere near being deployed at an air traffic control center," said Jacob. "However, we imagine an air traffic control center where assignments of the next plane would go to whoever is most bored, while those who are currently overwhelmed will not be assigned any new planes until their stress level lowers."
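The assignment policy Jacob imagines could be sketched as follows. This is purely hypothetical code illustrating the quote above; the controller names, the 0-to-1 stress scale, and the overwhelmed threshold are all invented for the example, not part of the Tufts system.

```python
# Route the next plane to the least-stressed available controller, and hold
# it if everyone is currently overwhelmed. Threshold is an assumed value.
OVERWHELMED = 0.8  # stress on a 0 (bored) .. 1 (overwhelmed) scale

def assign_next_plane(stress_by_controller):
    """Return the controller to take the next plane, or None to hold it."""
    available = {c: s for c, s in stress_by_controller.items()
                 if s < OVERWHELMED}
    if not available:
        return None  # queue the plane until someone's stress level lowers
    return min(available, key=available.get)

print(assign_next_plane({"alice": 0.9, "bob": 0.3, "carol": 0.5}))  # -> bob
```

Here "alice" is skipped as overwhelmed and the plane goes to "bob", the most bored controller on duty.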
Jacob's human-computer interaction program is funded by a $445,000 grant from the National Science Foundation (NSF). Also working on the project are Leanne Hirshfield and Erin Solovey, graduate researchers in the Tufts School of Engineering.