I understand the point about the regulators' "shift in focus," but my point is that this doesn't change reality. Said another way, just because NHTSA regulators are concentrating only on V2V at present doesn't mean V2V is sufficient for self-driving. It's just an artifact of what NHTSA may have determined to be manageable in the short term. It's a lot easier to "stick" the expense of this vehicle evolution on the automakers (V2V is their responsibility and cost) than to burden the DOT with the cost of upgrading the infrastructure. That's all. I wouldn't infer anything more from this shift.
Reality is that a true self-driving car needs both types of comms. If we concentrate first on V2V, fine and good, but until we spend quality time on V2I, we won't get that self-driving car.
Just to be clear, I do understand the implications of how both V2I and V2V are needed for the ultimate future of self-driving cars, which may not arrive for a long time yet.
But the purpose of this article is to point out a shift in focus in terms of the market and regulators.
Since the early 2000s, the U.S. Dept. of Transportation has worked closely with the major automotive manufacturers and other state and private sector stakeholders on V2I and V2V.
While their effort initially focused on V2I, the department's more recent focus, through the NHTSA, has been on V2V safety communications, with a secondary focus on V2I and vehicle-to-mobile device applications.
That is the larger trend I see emerging in the segment.
You forgot the most important design constraint on a self-driving car: weather. It has to be able to operate year-round, in all weather conditions a person can drive a car in, or it's dead on arrival for most areas where people live. So forget about standard speeds; it has to adjust its speed for the current conditions. Optically observing lines painted on the road can be useful, but it can't be required. The car must operate both properly and safely when the road is covered in snow and none of those lines are visible. Fog, rain, snow, wind, etc.: whatever a person can drive in, it must be able to drive in too.
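To illustrate the "adjust its speed for the current conditions" point: a minimal sketch of condition-based speed derating. The condition factors here are illustrative assumptions, not values from any real vehicle system.

```python
# Hypothetical sketch: derate target speed for weather conditions.
# The factors below are made-up assumptions for illustration only.
CONDITION_FACTORS = {
    "clear": 1.00,
    "rain": 0.80,
    "fog": 0.60,
    "snow": 0.50,  # lane markings may be invisible; rely on other cues
}

def target_speed(posted_limit_mph: float, condition: str) -> float:
    """Return a condition-adjusted target speed, never above the posted limit."""
    # An unrecognized condition defaults to the most conservative factor.
    factor = CONDITION_FACTORS.get(condition, 0.5)
    return round(posted_limit_mph * factor, 1)

print(target_speed(65, "clear"))  # 65.0
print(target_speed(65, "snow"))   # 32.5
```

The key design point, matching the comment above: the posted limit is only an upper bound, and an unknown condition degrades to the most cautious behavior rather than the default one.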
@BarrySweezey, thanks for commenting. While what you point out here is true, I'd have to say that we are hardly there yet, at least at this point in time. The Google car's demo has been more or less a "show." What we need is much more robust testing and verification.
What happens if a newspaper blows on the windshield of your car, while you're driving, obscuring your vision completely? What happens if someone from a passing car sprays paint all over your windshield? What happens if you turn your head because your child is throwing things around in the back seat? Same thing. You need info from the environment to be able to drive.
Sorry, I strongly disagree. Connectedness is essential for a real solution, one that works without limiting yourself to carefully pre-planned routes or requiring a driver ready to take over instantly. The Google car is far from a complete solution:
How would the Google solution work when a road has been milled, or when it has just been repaved and lane markers not yet painted? How would the Google solution adjust the route when that bridge is blocked for repair or for an accident? The Google car assumes the speed limit on a road based on a database. How does it know when that speed limit has been changed for some short-term reason? These, and the examples in the article above, prevent the existing Google solution from being complete. The human driver cannot be reading the paper or taking a nap. To resolve the limitations, you need improved info from the environment, info that manual driving does provide to the driver.
The simple fact is, driving is never an "autonomous" experience. The driver is taking in all manner of inputs, all of the time, from the infrastructure and from other vehicles. The complete solution for driverless cars cannot ignore this. Instead, the solution has to replace or emulate these comms.
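One way to picture "replace or emulate these comms" is a periodic V2V heartbeat message. The sketch below is loosely inspired by the field set of the SAE J2735 Basic Safety Message, but it is a simplified, hypothetical structure, not an implementation of that standard; the names and the 20-degree threshold are assumptions for illustration.

```python
# Simplified, hypothetical V2V "heartbeat" message, loosely inspired by
# the SAE J2735 Basic Safety Message field set (not a real implementation).
from dataclasses import dataclass

@dataclass
class SafetyMessage:
    vehicle_id: str
    lat: float          # degrees
    lon: float          # degrees
    speed_mps: float    # meters per second
    heading_deg: float  # 0-360, clockwise from north
    brake_active: bool

def is_braking_on_our_heading(msg: SafetyMessage, own_heading_deg: float) -> bool:
    """Crude check: a braking vehicle traveling roughly our direction."""
    diff = abs(msg.heading_deg - own_heading_deg) % 360
    same_direction = min(diff, 360 - diff) < 20  # handles wrap-around at 0/360
    return msg.brake_active and same_direction

msg = SafetyMessage("abc123", 40.0, -75.0, 12.0, 90.0, True)
print(is_braking_on_our_heading(msg, 95.0))  # True
```

The point is simply that the cues a human picks up visually (brake lights, a car drifting into the lane) become explicit broadcast data that every nearby vehicle can consume, regardless of line of sight.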
The Google cars have driven over 500,000 miles on public highways safely. They seem to have dealt with lane markers, road edges, traffic signals, and the intentions of adjacent drivers. They've gotten to their destinations without real-time information about their entire route.
"Connectedness" will be useful, but it's not necessary.
For any real move to automated driving, V2I has to be the backbone. V2V can help but, IMO, should always be secondary to V2I at target separations over 50 ft or so.
Fully autonomous vehicles are a flash in the pan, IMO. I would be extremely concerned to see multiple manufacturers' stand-alone implementations of autonomy driving on the same road at the same time. I've seen no comments on the failure modes of the lidar/radar and the various other sensors that provide the data on which decisions are made in each vehicle. Even if autonomy results in a reduction in accidents, there will still be some, I assume, and the degradation of information fed into the V2V network might be questionable.
V2I moves the responsibility of major sensor data to a fully redundant and reliable infrastructure with little chance of being compromised when accidents do occur. Scaling the sensor and communication network then becomes a public cost, hopefully with the correct maintenance and calibration included. Infrastructure based sensors should be a fraction of the cost of mobile sensors overall, and the computing power required is more easily planned as infrastructure.
Autonomous solutions for parking, multi-point turns, and vehicle lane and separation verification would seem like good targets for the auto companies and for close-proximity, low-speed V2V. However, autonomy of thousands of vehicles in a local region (a few blocks) will, IMO, be a big fail without V2I.
As a last comment on the need for V2I: it provides the Holy Grail for traffic route planning. With the proper data available, all streets and freeway lanes can be used, and traffic can be dynamically allocated as conditions change. Route mapping then becomes an endpoint-to-endpoint decision made with the infrastructure. You don't get to choose the route waypoints, and all roads become toll-enabled.
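The dynamic-allocation idea above can be sketched as ordinary shortest-path routing where edge costs come from a live infrastructure feed. This is a toy illustration, not any deployed system; the road graph, travel times, and the "bridge blocked" update are all made up.

```python
# Toy sketch of infrastructure-driven routing: edge costs (travel times in
# seconds) would come from a live V2I feed; here they are hard-coded.
import heapq

def shortest_route(graph, start, goal):
    """Dijkstra over current edge costs; graph: node -> {neighbor: seconds}."""
    dist = {start: 0.0}
    prev = {}
    pq = [(0.0, start)]
    while pq:
        d, node = heapq.heappop(pq)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry
        for nbr, cost in graph.get(node, {}).items():
            nd = d + cost
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = node
                heapq.heappush(pq, (nd, nbr))
    # Reconstruct the path from goal back to start.
    path, node = [], goal
    while node != start:
        path.append(node)
        node = prev[node]
    path.append(start)
    return list(reversed(path))

roads = {"A": {"B": 60, "C": 30}, "B": {"D": 60}, "C": {"D": 30}}
print(shortest_route(roads, "A", "D"))  # ['A', 'C', 'D']

# Live update from the infrastructure: the bridge on A->C is blocked,
# so its cost spikes and the next query reroutes automatically.
roads["A"]["C"] = 10_000
print(shortest_route(roads, "A", "D"))  # ['A', 'B', 'D']
```

This is the essence of the claim: if the infrastructure publishes current costs for every lane, rerouting around a blocked bridge is just a re-query, and the same cost channel is naturally where tolling would live.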