@Junko: many MCU vendors, including ST and Freescale, are already offering sensor fusion algorithms and code at no additional cost. Isn't this a threat to the likes of Hillcrest Labs, Sensor Platforms, etc.?
Cell phones are no longer just cell phones. Thanks to Android and iOS, a wide variety of applications make use of the sensors on these devices, including many important applications in medical electronics and measurement. A common sensor interface (I mean a hub here) would surely provide better handling of these sensors, including the possibility of standard calibration. I frequently use audio analyzer applications, but they give different results on different devices due to the lack of a common calibration standard.
I have been very impressed with the advances the cell phone industry has driven in very capable low-power SoCs, but it looks like I have only been paying attention to half of the story. One platform I have been working on does sensor fusion the old-fashioned way, with discrete sensors and fusion performed in the main CPU. This creates resource contention and, from the sound of it, uses more power than a sensor hub approach. Is the main-CPU approach still used in current cell phone designs, or is the sensor hub something that is coming down the road?
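For readers unfamiliar with what "fusion performed in the main CPU" looks like, a minimal sketch is a complementary filter that blends two discrete sensors' readings into one orientation estimate. The sensor values, sample rate, and the 0.98 blend weight below are illustrative assumptions, not details of any actual handset design:

```python
def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Blend the integrated gyroscope rate (smooth short-term, but drifts)
    with the accelerometer tilt angle (noisy, but drift-free) into a
    single pitch estimate. alpha weights the gyro path."""
    return alpha * (angle + gyro_rate * dt) + (1.0 - alpha) * accel_angle

# Hypothetical readings: gyro reports 10 deg/s while the accelerometer
# insists the device is tilted only 0.5 deg; 100 samples at 10 ms each.
angle = 0.0
for _ in range(100):
    angle = complementary_filter(angle, 10.0, 0.5, 0.01)
```

Running a loop like this on the application processor at a few hundred hertz is exactly the kind of always-on work that a dedicated low-power sensor hub is meant to offload.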
Drones are, in essence, flying autonomous vehicles, and the pros and cons surrounding drones today may well foreshadow the debate over the development of self-driving cars. In the context of a strongly regulated aviation industry, "self-flying" drones pose a fresh challenge. How safe is it to fly drones in different environments? Should drone operators be required to maintain visual line of sight, as pilots of manned aircraft are? Join EE Times' Junko Yoshida as she moderates a panel of drone experts.