3-D gesture recognition is going mainstream--jumping from the consumer gaming market to the most ubiquitous of user interfaces (UIs): the TV remote. Programmers should check out the new 3-D gesture recognition APIs for Android 2.3 Gingerbread. I predict @NextGenLog that 3-D gestures like shake-to-undo will become standard for UIs within two years.
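To make the shake-to-undo idea concrete, here is a minimal sketch of how a shake gesture can be detected from raw accelerometer samples. The sample format, the 2.5 g threshold, and the peak count are all illustrative assumptions, not any platform's actual API:

```python
# Hypothetical shake-detection sketch: count acceleration "peaks" well
# above gravity and call it a shake once enough peaks accumulate.
import math

GRAVITY = 9.81          # m/s^2
SHAKE_THRESHOLD = 2.5   # magnitudes above 2.5 g count as shake peaks (assumed)
MIN_PEAKS = 3           # require several peaks so one bump doesn't trigger undo

def is_shake(samples):
    """samples: list of (ax, ay, az) accelerometer readings in m/s^2."""
    peaks = 0
    for ax, ay, az in samples:
        g_force = math.sqrt(ax * ax + ay * ay + az * az) / GRAVITY
        if g_force > SHAKE_THRESHOLD:
            peaks += 1
    return peaks >= MIN_PEAKS

# A phone at rest reads roughly (0, 0, 9.81); vigorous shaking produces
# spikes well above 2.5 g in alternating directions.
rest = [(0.0, 0.1, 9.8)] * 10
shaking = [(30.0, 0.0, 9.8), (-28.0, 0.0, 9.8), (31.0, 0.0, 9.8), (0.0, 0.0, 9.8)]
print(is_shake(rest))      # False
print(is_shake(shaking))   # True
```

Real mobile APIs deliver the same kind of sample stream via event callbacks, but the thresholding idea is the same.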
During the mid-term election coverage, I noticed that on CNN's "John King" show they had 3-D interactive statistical graphs displayed literally in thin air, and he could control the graphs with gestures. I thought that was cool; now I know how they did it.
Gestures are great when a single user interfaces with a machine, but what happens when three or four users share the same machine and try to overpower each other with firmer gestures to change channels, increase the volume, etc.? It's a fun UI, but is it useful?
Easy to solve... the point cloud provided by 3-D sensors can easily be processed to figure out which user is in control, even if their positions overlap. And there can be a simple gesture to 'take control' of the remote. At the end of the day, however, you can't stop people from fighting for the remote even today ;)
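A "take control" gesture could be arbitrated very simply once the sensor's point cloud has been segmented per user. This sketch assumes an upstream tracker already reports each user's hand and head heights (the dictionary shape, user ids, and the 25 cm margin are illustrative assumptions):

```python
# Hedged sketch of deciding "who holds the remote": control goes to
# whoever raises a hand clearly above their own head, highest raise wins.
RAISE_MARGIN = 0.25  # hand must be 25 cm above the head to count as raised

def controlling_user(users):
    """users: {user_id: (hand_y, head_y)} in metres, y pointing up.
    Returns the id of the user with the highest raised hand, or None."""
    best_id, best_lift = None, RAISE_MARGIN
    for uid, (hand_y, head_y) in users.items():
        lift = hand_y - head_y
        if lift >= best_lift:
            best_id, best_lift = uid, lift
    return best_id

# "alice" raises a hand 30 cm above her head; "bob" keeps his hand down.
frame = {"alice": (2.0, 1.7), "bob": (1.6, 1.8)}
print(controlling_user(frame))                    # alice
print(controlling_user({"bob": (1.6, 1.8)}))      # None
```

Measuring the raise relative to each user's own head keeps the rule fair regardless of how tall anyone is or where they stand.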
The cool thing about using the gesture recognition APIs built into iOS and Android is that the devices already have the hardware--accelerometer, gyroscope, magnetometer, and barometer. Those MEMS sensors can perform all the location-based functions for which they were intended while also giving app writers access to 3-D gestures for free!
Gaming solutions, I suspect, will evolve into handheld controllers--perhaps more life-like and prop-like, with buttons for low-latency event tracking--working in parallel with 3-D environmental sensors. The 3-D sensor will get on a Moore's-law-type improvement path, delivering XGA resolution of the scene with less-than-1-mm precision at 60 fps, etc. Kinect is a coarse (but brilliant) hack compared to what we will see in this space in three to five years.
Very interesting article. I think gesture recognition is an optimal input method because it is basically the gadget adapting to the human and not the other way around.
Once a user has experienced the ease of controlling a device with hand gestures, it's hard to go back to mice and keyboards.
Of course, there will always be places where a touch screen isn't viable or even needed, so keyboards, buttons, and mice will not disappear.
I think this research will help not only the gaming industry but also real-time training for skills such as driving, sports, and dance. It will bring down the cost of simulators.