I have been very impressed with the advances the cell phone industry has driven in highly capable low-power SoCs, but it looks like I have only been paying attention to half of the story. One platform I have been working on does sensor fusion the old-fashioned way, with discrete sensors and fusion performed on the main CPU. This creates resource contention and, from the sound of it, uses more power than a sensor-hub approach. Is this approach used in current cell phone designs, or is it something that is coming down the road?
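For readers unfamiliar with what "fusion on the main CPU" involves: a minimal sketch of one common technique is a complementary filter, which blends integrated gyroscope rate (accurate short-term) with the accelerometer's gravity estimate (stable long-term). This is an illustrative example, not the commenter's actual code; the function name and parameters are hypothetical.

```python
import math

def fuse_pitch(pitch, accel, gyro_rate, dt, alpha=0.98):
    """One complementary-filter step for the pitch angle (radians).

    pitch     -- previous fused pitch estimate
    accel     -- (ax, ay, az) accelerometer sample in g
    gyro_rate -- pitch rate from the gyroscope, rad/s
    dt        -- time step in seconds
    alpha     -- blend factor: weight given to the gyro path
    """
    ax, ay, az = accel
    # Pitch implied by gravity direction (noisy but drift-free).
    accel_pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    # Pitch from integrating the gyro (smooth but drifts).
    gyro_pitch = pitch + gyro_rate * dt
    # Blend: trust the gyro short-term, the accelerometer long-term.
    return alpha * gyro_pitch + (1 - alpha) * accel_pitch
```

Running this loop at a few hundred hertz on the application processor is exactly the kind of always-on workload a dedicated sensor hub offloads.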
@Junko: many MCU vendors, including ST and Freescale, are already offering sensor fusion algorithms and code at no additional cost. Isn't this a threat to the likes of Hillcrest Labs, Sensor Platforms, etc.?
Cell phones are no longer just phones: thanks to Android and iOS, a wide variety of applications make use of the sensors on these devices. There are also important applications in medical electronics and measurement. A common sensor interface (I mean a hub here) would surely provide better handling of these sensors, including the possibility of standard calibration. I frequently use audio analyzer applications, but they give different results on different devices due to the lack of a common calibration standard.
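The calibration point above could be addressed if a common hub interface stored per-device correction constants and applied them before readings reach applications. A minimal sketch of that idea, with hypothetical offset/scale values (not from any real device):

```python
def apply_calibration(raw, offset, scale):
    """Convert raw sensor counts to physical units using
    per-device calibration constants: subtract the measured
    zero offset, then scale. If a standard hub API exposed
    these constants, the same audio-analyzer app would see
    comparable readings on different handsets.
    """
    return [(r - o) * s for r, o, s in zip(raw, offset, scale)]

# Example: two-axis raw samples with per-axis offset and scale.
calibrated = apply_calibration([10, 20], [2, 4], [0.5, 0.25])
```

The design choice here is that calibration lives with the platform (hub firmware or driver), not with each application, so every app benefits from the same correction.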