Developers of speech-analysis software have long dreamed of getting help from lip-reading cues, which can significantly improve speech-to-text accuracy. Only now do we have ubiquitous cameras and the CPU power needed to help clarify what you are saying to your device.
This would be especially valuable for low-cost computers serving non-literate societies.
In my opinion, embedded vision is a dangerous concept for a user interface. You can multitask while listening to music on your headphones, or while talking into your microphone or mobile, but it is almost impossible once your eyes are engaged with something.
This is the first time I've heard the term "embedded vision," and I think it's appropriate. The kinds of apps coming out are really impressive. I never thought it would be possible to diagnose whether you'd had a little too much alcohol just by having a camera scan your eye. Wow! Wait a minute... put this app on an iPhone, connect the iPhone to the car (perhaps via Bluetooth), and the iPhone could stop the car from starting if your eyes show that the party was really hard. I think this could save lives, don't you?
I look forward to putting on a headset that replaces my vision with displays providing a superset of what I could see without it. Sci-fi shows and movies have imagined these in various ways. It might be as "simple" as a high-definition heads-up display giving you quantitative data on whatever you're seeing through the headset's eyes. This would include automatic IR/visible-light switching depending on the environment, plus automatic zoom and macro capability. (That's right, fellow plus-40ers: no readers necessary!)
These applications have been developing for some time, even if not strictly under the name of "embedded vision."
What I'm most intrigued about is applying these techniques to privately owned medical devices, with optional communication to your doctor's office. Any number of tests and diagnoses should be doable this way, and potentially more reliable than current methods.
Another application is self-driving cars. Just yesterday, I commented to my wife how in a few years, we won't believe how people could ever have been trusted to drive manually. How reckless of us!
The more I learn about this "embedded vision" thing, the more fascinating it becomes. As Jeff Bier said during the interview, "We have only scratched the surface" as far as embedded vision applications are concerned.
What's your favorite "embedded vision" product?