Founding of startup Affectiva
The affective computing research carried out by Picard’s team at MIT has not been locked up in the ivory tower. In 2009, Picard co-founded an MIT spin-off called Affectiva. The startup has successfully commercialized emotion technologies, including Affdex, an automated facial coding platform, and the Q Sensor, a wearable biometric sensor.
Although Picard noted that making money in the commercial market had never been her goal, she said the flood of inquiries from the medical community compelled her to start Affectiva. “If I didn’t help them, I felt as though I were holding up the medical research.”
Picard’s quest to make a computer that understands emotions has also led to a partnership with researchers studying autistic children. Autistic kids are known for their difficulty reading other people’s emotions; they can’t seem to put facial expressions, vocal affect or physiological changes into context. As Picard’s team worked with people with autism, Picard said, “We discovered the autistic people showed totally off-the-chart emotional responses when they were agitated.” She added, “Even scarier was that the sensor recorded an enormous emotional peak just before the autistic person had a seizure.”
Rather than teaching autistic people how to understand others, Picard and her team ended up teaching caregivers how to watch for these signs.
Picard believes that the automated facial coding platform developed by Affectiva works quite well. “We have the accuracy rate in the range of the upper 90 percent” when it comes to its pattern recognition, she said. “Our tools have been trained and tested not only in the United States but also in China and Brazil.”
Much more challenging, though, is the development of emotion prediction, Picard said.
Emotion is complicated. Humans can be angry or afraid before the signals even reach the cortex, and before they become aware of what’s happening to them. Picard likes the example of people’s reaction to the sight of a snake. “Sometimes, we are already behaving, such as jumping out of the way of danger when we see a snake, before we become aware of an emotion such as fear,” Picard said.
In that sense, “Reading emotions is equivalent to forecasting the weather. You need to know how the weather changes as you look at the weather through a window,” she explained.
Asked about affective computing’s impact on hardware, Picard said, “As the algorithms [for computers to read emotions] get more complex, they require computing machines that can speed up pattern recognition.”
While improving processing performance is hard enough, doing it in an embedded system with cost and power constraints will be even more challenging, Bier noted. To have any practical use, he said, computer vision systems need to fit into equipment that is both smaller and less expensive.
DESIGN East, scheduled for September 17–20 in Boston, combines the Android Summit, DesignMED, the LED Summit and Sensors in Design events with UBM's Embedded Systems Conference (ESC) Boston. DESIGN East is also co-located with the Embedded Vision Summit, a technical education forum for engineers centered on embedded vision technology applications.