Picture this: a roadway three lanes wide, where every vehicle has radar. What prevents inter-vehicle interference? Are all the adjacent vehicles time-synchronized with each other to avoid registering the pulse reflection from the vehicle in the next lane?
The European-funded project MOSARIM (MOre Safety for All by Radar Interference Mitigation) started in January 2010 with two main objectives: to investigate possible automotive radar interference mechanisms through both simulation and real-world road tests, and to assess possible countermeasures and mitigation techniques, summarized as general guidelines and recommendations.
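One family of mitigation techniques studied in projects like MOSARIM is decorrelating transmissions in time, so that two unsynchronized radars don't collide on every pulse. Here is a toy Monte Carlo sketch of that idea (not a real radar model; the pulse interval, receive-window width, and pulse count are all invented for illustration):

```python
import random

# Toy Monte Carlo: two pulsed radars share a band. A "collision" means an
# interferer's pulse lands inside the victim's receive window. All
# parameters below are assumed purely for the demo.
PRI = 100e-6      # pulse repetition interval: 100 us (assumed)
WINDOW = 2e-6     # half-width of victim receive window: 2 us (assumed)
N = 1000          # number of pulses simulated

def collision_rate(randomize):
    """Fraction of victim pulses hit by the interferer."""
    offset = random.uniform(0, PRI)          # unsynchronized start phase
    hits = 0
    for _ in range(N):
        jitter = random.uniform(0, PRI) if randomize else 0.0
        delta = (offset + jitter) % PRI      # interferer arrival vs. window
        if delta < WINDOW or delta > PRI - WINDOW:
            hits += 1
    return hits / N

random.seed(42)
# Identical fixed timing: either every pulse collides or none does, so an
# unlucky offset produces a persistent ghost target at a stable range.
print("fixed timing:", collision_rate(False))
# Randomized (dithered) timing: collisions still occur occasionally, but
# they are scattered in time and look like noise a tracker can reject.
print("randomized  :", collision_rate(True))
```

The point of the sketch: without randomization the collision rate is all-or-nothing (a stable false target), while dithering spreads the hits thinly across pulses, which is far easier to filter downstream.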
While there will always be an incentive to simplify these systems due to cost pressure (especially as ADAS becomes a mainstream, high-volume technology), I believe that the most effective systems will combine multiple sensor types -- such as vision plus radar. And when it comes to saving lives, we want the most effective systems: over 1 million people die in automobile accidents worldwide every year.
Another factor in favor of using vision in these applications is that a vision system can be thought of as a "software-defined sensor", which can be adapted to multiple purposes. For example, Mercedes is using a camera and embedded vision system to scan the road surface and adjust the car's suspension in real time for each bump in the road, resulting in a dramatic improvement in ride comfort. See http://bit.ly/LUvH42 for a review of this technology.
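A minimal sketch of the "software-defined sensor" idea (the class and analyzer names here are my own invention, not any vendor's API): the same frame stream feeds multiple interchangeable algorithms, so a new capability is a software update rather than a new piece of hardware.

```python
from typing import Callable, Dict, List

# Stand-in for an image: rows of pixel intensities.
Frame = List[List[int]]

class SoftwareDefinedSensor:
    """One physical camera, many software-defined capabilities."""
    def __init__(self) -> None:
        self.analyzers: Dict[str, Callable[[Frame], float]] = {}

    def register(self, name: str, fn: Callable[[Frame], float]) -> None:
        # Attach a new capability without touching the hardware.
        self.analyzers[name] = fn

    def process(self, frame: Frame) -> Dict[str, float]:
        # Run every registered analyzer on the same frame.
        return {name: fn(frame) for name, fn in self.analyzers.items()}

def mean_brightness(frame: Frame) -> float:
    # e.g. an input for automatic headlight control
    pixels = [p for row in frame for p in row]
    return sum(pixels) / len(pixels)

def row_roughness(frame: Frame) -> float:
    # e.g. a crude road-surface proxy for active suspension
    return max(max(row) - min(row) for row in frame)

cam = SoftwareDefinedSensor()
cam.register("brightness", mean_brightness)
cam.register("roughness", row_roughness)
print(cam.process([[10, 12, 11], [30, 90, 35]]))
```

The same registration step could later add lane detection or pedestrian detection to the identical camera, which is exactly what a fixed-function radar front end cannot do.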
For those who want to learn how such systems are built, there are still a few seats available at the Embedded Vision Summit on October 2nd in the Boston area, where we'll have a full day of presentations and demos on embedded vision applications, algorithms, design techniques, and technology. See http://bit.ly/1d3xTrK for details.
In fact, I do understand that this is not an either/or question. And yet, in talking about this with several participants at the conference here, I realized that there are many different shades of radar and vision technologies.
Carmakers can choose vision sensors integrated with more smarts and intelligence while adopting a lighter version of the radar system. Or, they can pay more for the heavier-duty radar system and add a much more straightforward image sensor (without much intelligence). There seems to be a growing set of options for carmakers.
@JeffBier, thanks for the URL! Thinking of a vision system as a "software-defined sensor" is an intriguing idea. As more intelligence and smarts get integrated into the vision system, that seems to be the trend...