@MP Divakar: Thanks for your comment! We at Hillcrest agree that in some applications it is most effective to include the fusion algorithms in the remote control. Our Freespace® MotionEngine™ software is modular, so you can implement full processing on the remote, or just sensor management on the remote with fusion, calibration, etc. on the host (e.g. the TV SoC). If full motion performance at the lowest power and cost is desired, then doing more processing on the host is generally a great solution. In the end, though, the optimal choice depends on the particular application and the customer's design objectives.
@R. Colin Johnson: Interesting article... Lucien's statement that remotes "...have to transmit raw sensor data from the MEMS chips" passes the burden (not to mention processing steps such as filtering and amplifying the received signals) to the processors in the TV! I thought the most efficient use of gesture-control data (accelerometer & gyro outputs) was through fusion at the sensor interface itself.
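For readers unfamiliar with what "fusion at the sensor interface" means in practice: a common lightweight approach is a complementary filter, which blends the gyro's fast but drifting rate signal with the accelerometer's noisy but drift-free angle estimate. The sketch below is purely illustrative, with hypothetical names and constants; it is not Hillcrest's MotionEngine code, just a minimal example of the kind of fusion that can run on a small MCU next to the sensors instead of on the TV SoC.

```python
def complementary_filter(angle_prev, gyro_rate, accel_angle, dt, alpha=0.98):
    """Blend an integrated gyro rate with an accelerometer angle estimate.

    alpha near 1.0 trusts the gyro for short-term changes while the
    accelerometer term slowly corrects long-term drift. All names and
    the alpha value here are illustrative assumptions.
    """
    return alpha * (angle_prev + gyro_rate * dt) + (1.0 - alpha) * accel_angle

# Toy scenario: device held still (true angle 0 deg), accelerometer
# correctly reads 0 deg, but the gyro has a 1 deg/s drift bias.
angle = 0.0
for _ in range(100):
    angle = complementary_filter(angle, gyro_rate=1.0, accel_angle=0.0, dt=0.01)

# The accelerometer term keeps the estimate bounded: pure integration of
# the biased gyro over this second would have accumulated a full 1 deg.
print(angle)
```

Because a filter like this needs only a few multiplies per sample, running it in the remote means transmitting a compact orientation estimate rather than streaming raw MEMS samples over the radio, which is the power trade-off being debated above.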
Drones are, in essence, flying autonomous vehicles. The pros and cons surrounding drones today might well foreshadow the debate over the development of self-driving cars. In the context of a strongly regulated aviation industry, "self-flying" drones pose a fresh challenge. How safe is it to fly drones in different environments? Should drones be required to stay within visual line of sight, as piloted airplanes are? Join EE Times' Junko Yoshida as she moderates a panel of drone experts.