I think Google hasn't been paying attention. The user experience for Google Glass has already been defined. Automated 3D mapping combined with motion interaction is the largest missing aspect of that experience. This has been observable in the VJ community for years, with many apps being developed in Unity 3D, meaning they're ready to go on Android and iOS. Near-eye transparent displays and projection work very similarly, which makes me think Google Glass and Project Tango should be part of the same project. Lastly, Google doesn't seem to understand that you need to be able to remove the 3D sensor from the display itself so that mapping and motion-sensing functions can run concurrently. More functionality of the display and sensor is unlocked if you can reposition them relative to each other, e.g. simulated surface touch with a pico projector.
Drones are, in essence, flying autonomous vehicles. The pros and cons surrounding drones today may well foreshadow the debate over the development of self-driving cars. In the context of a strongly regulated aviation industry, "self-flying" drones pose a fresh challenge. How safe is it to fly drones in different environments? Should drones be required to remain within visual line of sight, as piloted airplanes are? Join EE Times' Junko Yoshida as she moderates a panel of drone experts.