Earlier this month, IMS Research (Austin, Texas) issued a press release questioning whether Apple and the iPad are falling behind competitors in user interface technologies.
The point of the research company’s commentary, to paraphrase, was: Sure, Apple has changed the game by bringing touch-screen interaction to the masses; but is that all? Shouldn’t Apple also be embracing embedded vision technologies in its next product release?
The industry still wants to know: Where will the battle lines be drawn for the next-generation user interface, beyond touch? Will it be gesture, motion, or voice? What about mental telepathy?
A growing number of FPGA, DSP and processor companies are now betting the future on embedded vision.
Contributing to this movement are: a) the growing processing power (using parallelism) in embedded systems, and b) increasingly sophisticated machine-vision algorithms that let embedded systems not just see but extract information to produce necessary intelligence.
Jeff Bier, president of Berkeley Design Technology, Inc., said, “Thanks to Microsoft’s Kinect (used in Xbox 360), we now have ‘existence proof’ for embedded vision. We now know it works.”
Consumers, moreover, are becoming more familiar with gesture controls, and automotive manufacturers are integrating embedded-vision applications in cars in the cause of driver safety.
Jon Cropley, principal analyst at IMS Research, said the market for intelligent automotive camera modules alone was estimated at around $300 million in 2011 and is forecast to grow at an average annual rate of over 30% to 2015.
Meanwhile, the market for intelligent video surveillance devices (devices with embedded analytics) was estimated at about $250 million in 2011 and is forecast for annual growth to 2015 of more than 20%, he added.
Biggest and perhaps most established is the market for industrial machine vision hardware (smart sensors, smart cameras, compact vision systems, and machine vision cameras). Cropley estimated it at around $1.5 billion in 2011, with an average annual growth rate of over 10% to 2015.
But from the engineering community’s standpoint, Bier said, “Many design engineers, generally, just don’t think about vision, and they still don’t know what’s possible.”
Embedded vision is in fact a “classic long-tail story,” Bier said. “There are thousands of applications; and its market is extremely diverse.” Bier founded the Embedded Vision Alliance, an industry association set up to inspire and empower embedded system designers to use vision technology.
Working with the Embedded Vision Alliance, EE Times put together an image gallery that showcases the latest embedded vision-enabled consumer products.
Never miss a word?
That’s a marketing tagline used by Livescribe, a company that developed a platform consisting of a digital pen, digital paper and software apps. When used with digital paper, Livescribe’s pen -- integrated with an infrared camera and a digital audio recorder -- records a conversation while one participant takes notes on digital paper.
Digital paper is covered with numerous small black dots in patterns essentially invisible to the human eye, but detectable by the pen’s camera. It allows a user to replay portions of a recording by tapping on the notes -- no matter how messy the scribble. Not a single word is lost. A reporter’s dream; a politician’s nightmare.
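The tap-to-replay idea is simple to sketch: while the user writes, the pen can log each decoded paper position together with the audio timestamp at which it was written; a later tap is then resolved to the nearest logged stroke point, whose timestamp tells the player where to seek. The class and method names below are hypothetical illustrations, not Livescribe's actual API or implementation.

```python
import math

class NotePlayback:
    """Hypothetical sketch of pairing pen positions with an audio clock."""

    def __init__(self):
        self.stroke_log = []  # list of (x, y, audio_time_seconds)

    def record_point(self, x, y, audio_time):
        # Called while writing: log the decoded paper position alongside
        # the current audio-recording timestamp.
        self.stroke_log.append((x, y, audio_time))

    def seek_time_for_tap(self, x, y):
        # Called on a tap: return the audio time of the closest written
        # point, or None if nothing has been written yet.
        if not self.stroke_log:
            return None
        nearest = min(self.stroke_log,
                      key=lambda p: math.hypot(p[0] - x, p[1] - y))
        return nearest[2]

pen = NotePlayback()
pen.record_point(10.0, 5.0, audio_time=12.5)   # a word written at 12.5 s
pen.record_point(42.0, 7.0, audio_time=30.0)   # another word at 30.0 s
print(pen.seek_time_for_tap(11.0, 5.5))        # tap near the first word
```

Tapping near the first word returns 12.5, so playback would resume at the moment that note was taken; the real product additionally decodes absolute page coordinates from the dot pattern, which this sketch takes as given.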