I think Google hasn't been paying attention. The user experience for Google Glass has already been defined. Automated 3D mapping combined with motion interaction is the largest missing piece of that experience. This has been observable in the VJ community for years now, with many apps being developed in Unity 3D, meaning they are ready to run on Android and iOS. Near-eye transparent displays and projection work very similarly, which makes me think Google Glass and Project Tango should be part of the same project. Lastly, Google doesn't seem to understand that you need to be able to remove the 3D sensor from the display itself so the mapping and motion-sensing functions can run concurrently. More functionality of both the display and the sensor is unlocked if you can reposition them relative to each other, e.g. simulated surface touch with a pico projector.
As we unveil EE Times’ 2015 Silicon 60 list, journalist and Silicon 60 researcher Peter Clarke hosts a conversation on startups in the electronics industry. Panelists Dan Armbrust (investment firm Silicon Catalyst), Andrew Kau (venture capital firm Walden International), and Stan Boland (successful serial entrepreneur and former CEO of Neul and Icera) join the live debate.