The author is to be congratulated for an excellent article on sensor fusion and its future directions.
I would also like to call his attention (since he is local to Silicon Valley) to an event on a closely related topic being held on Saturday the 29th of this month:
Next Generation Circuit & Systems, Communication and Sensor Technologies in Mobile Devices
Hello Mr. Ristic,
Thank you for the succinct description of sensor fusion in the article above. While working on sensor fusion to estimate the velocity of a body frame, I came across some real-world challenges on which I would like to seek your advice.
* When the mobile device is mounted on a moving body, there are three coordinate systems: the geomagnetic Earth frame, the frame of the device mount, and the frame of motion of the body itself. In this case, how can we project the acceleration vector onto the body frame alone?
* Furthermore, from inertial navigation we know that gyroscopes can be used to lock inertial vectors into the reference frame. But how do we account for centripetal forces in this case?
It would be very helpful if you could clarify the questions above.
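While waiting for the author's answer, here is a minimal sketch of one common approach to both questions, assuming the device's orientation relative to the body mount is known from calibration. The rotation matrix `R_device_to_body`, the function name, and the example numbers are all hypothetical illustrations, not anything from the article: the accelerometer reading is first corrected for the centripetal term omega x (omega x r) that appears when the sensor sits at a lever arm r from the body's rotation center (measured with the gyroscope), and then rotated into the body frame.

```python
import numpy as np

# Hypothetical mount calibration: a fixed rotation taking device-frame
# vectors into body-frame vectors (here, device rotated 90 degrees
# about its z-axis relative to the body).
R_device_to_body = np.array([
    [0.0, -1.0, 0.0],
    [1.0,  0.0, 0.0],
    [0.0,  0.0, 1.0],
])

def accel_in_body_frame(a_device, omega_device, r_device):
    """Project an accelerometer reading into the body frame, removing
    the centripetal term caused by the sensor sitting at lever arm
    r_device from the body's center of rotation.

    a_device     -- specific force measured by the accelerometer (m/s^2)
    omega_device -- gyroscope angular rate, device frame (rad/s)
    r_device     -- lever arm from rotation center to sensor (m)
    """
    # Centripetal acceleration sensed because the sensor is offset
    # from the rotation center: a_c = omega x (omega x r)
    a_centripetal = np.cross(omega_device,
                             np.cross(omega_device, r_device))
    # Remove the rotation-induced term, then rotate into the body frame
    a_linear_device = a_device - a_centripetal
    return R_device_to_body @ a_linear_device

# Example: body turning at 1 rad/s about z, sensor 0.5 m off-axis.
# The accelerometer then reads a pure centripetal acceleration of
# magnitude omega^2 * r = 0.5 m/s^2 toward the rotation center, and
# the compensated body-frame linear acceleration is zero.
omega = np.array([0.0, 0.0, 1.0])
r = np.array([0.5, 0.0, 0.0])
a_meas = np.array([-0.5, 0.0, 0.0])
print(accel_in_body_frame(a_meas, omega, r))
```

The same idea extends to the Earth frame: a sensor-fusion orientation estimate (e.g. a quaternion from gyro/accelerometer/magnetometer fusion) gives the Earth-to-device rotation, and chaining it with the mount calibration above relates all three frames.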
Drones are, in essence, flying autonomous vehicles. The pros and cons surrounding drones today may well foreshadow the debate over the development of self-driving cars. In the context of a strongly regulated aviation industry, "self-flying" drones pose a fresh challenge. How safe is it to fly drones in different environments? Should drones be required to remain within visual line of sight, as piloted airplanes are? Join EE Times' Junko Yoshida as she moderates a panel of drone experts.