"Computer vision is the next big thing for embedded systems, making them safer, more responsive to people, more efficient and more perceptive," said Jeff Bier, founder of the Embedded Vision Alliance and session chair of the upcoming Embedded Vision Summit.
In fact, nearly every category of consumer, automotive and industrial application is being enhanced today by embedded vision capabilities. Learn how to add the latest pattern-recognition capabilities into your embedded vision application at the Embedded Vision Summit on April 25, co-located with DESIGN West 2013.
Embedded vision started out as an esoteric technology that was expensive to implement, requiring a team of domain experts with deep experience in the black magic of pattern recognition to get it right. "NASA pioneered embedded vision for space exploration and the military has been using it for target recognition for decades," said Bier. "But until now it has been a niche technology in industry, such as for parts inspection. Now, the sensors and processors that perform the tens of billions of operations per second necessary to process millions of pixels are much more cost effective, enabling computer vision to be added to almost any embedded system."
Today a wide variety of applications are adding embedded vision capabilities, from automotive systems that avoid collisions by warning drivers, to security systems that detect nervous people acting suspiciously, to smartphones that let users control video playback with their eye movements.
3-D sensors such as PrimeSense's Carmine (licensed and popularized by Microsoft as the Kinect for Xbox) have brought down the cost of embedded vision solutions that estimate the distance from the sensor to objects in the scene (pictured). At the Embedded Vision Summit, Texas Instruments' Goksel Dedeoglu will present techniques for low-cost implementation of stereoscopic 3-D vision. SOURCE: TI
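The core idea behind stereoscopic 3-D vision is straightforward: the same point in a scene appears shifted horizontally between the left and right camera images, and that shift (the disparity) is inversely proportional to distance. The sketch below illustrates the principle with a toy scanline matcher; the focal length, baseline, and image values are illustrative stand-ins, not figures from TI's implementation.

```python
def depth_from_disparity(disparity_px, focal_length_px, baseline_m):
    """Pinhole stereo triangulation: Z = f * B / d.

    disparity_px: horizontal pixel shift of a point between cameras
    focal_length_px: camera focal length expressed in pixels
    baseline_m: distance between the two camera centers in meters
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px

def best_disparity(left_row, right_row, x, block=3, max_disp=16):
    """Toy sum-of-absolute-differences (SAD) block match on one scanline.

    Slides a small patch from the left image leftward across the right
    image and returns the shift with the lowest SAD cost.
    """
    half = block // 2
    patch = left_row[x - half:x + half + 1]
    best_d, best_cost = 0, float("inf")
    for d in range(0, max_disp + 1):
        xr = x - d
        if xr - half < 0:
            break
        cand = right_row[xr - half:xr + half + 1]
        cost = sum(abs(a - b) for a, b in zip(patch, cand))
        if cost < best_cost:
            best_d, best_cost = d, cost
    return best_d
```

For example, a feature with a 64-pixel disparity seen by a rig with an 800-pixel focal length and a 6 cm baseline sits 0.75 m away. Production stereo pipelines refine this with rectification, sub-pixel interpolation, and smoothness constraints, but the geometry is the same.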
"My favorite embedded vision application comes from Affectiva, which uses webcams to detect the emotions of a user," said Bier. "Imagine educational apps that pace learning by detecting frustration levels, or toys that stimulate a child's intellect when they detect boredom. The possibilities are endless, now that embedded vision technology is cheap enough for almost any app."
In fact, as more and more competitors add vision-based pattern-recognition algorithms, a new era of applications is emerging in which embedded vision must be integrated in order to succeed. Unfortunately, many engineers do not realize how useful computer vision can be, nor are they aware of the easy-to-use open-source embedded-vision algorithms available to streamline the development process.
"The biggest problem is that engineers are not aware of how useful and relatively easy it is to add computer vision to their embedded systems," said Bier.
The application of vision is indeed becoming more prevalent in embedded systems. It is fascinating to see how vision is being applied to problems in the industrial and consumer spaces that have traditionally been handled by other, more cumbersome solutions. The replacement of gaming controllers with stereo-vision-based solutions is one example among many. One approach to adding custom embedded vision to a product is outlined in the following TI white paper: http://www.ti.com/lit/wp/spry232/spry232.pdf
What is exciting is the growth in both processing capacity and sensor resolution. Vision requires huge processing resources, and with the proliferation of quad- and eight-core processors there is enough horsepower available to process live images in a timely fashion. Couple that with the advent of cheap, high-resolution cameras and you have a nexus of opportunity to provide "real time" live vision processing for the masses. Very exciting times indeed.
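One common way to exploit those extra cores is to fan independent frames out to a pool of workers while keeping results in capture order. The sketch below shows the pattern; the per-frame operation (mean brightness of a grayscale frame) and the worker count are toy assumptions chosen only to keep the example self-contained.

```python
from concurrent.futures import ThreadPoolExecutor

def process_frame(frame):
    """Toy per-frame operation: mean brightness of a grayscale frame,
    where a frame is a list of rows of pixel intensities."""
    total = sum(sum(row) for row in frame)
    pixels = len(frame) * len(frame[0])
    return total / pixels

def process_stream(frames, workers=4):
    """Process frames concurrently; pool.map preserves frame order."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(process_frame, frames))
```

In a real pipeline the per-frame function would be the expensive vision kernel (feature extraction, classification), and a process pool or native threads would be used to sidestep interpreter locking, but the fan-out/ordered-collect structure is the same.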