Hi, Duane. You would think that...but the legal framework of traffic rules on the road isn't something to be taken lightly. Details might be "ironed out" AFTER the lawsuit, but first you still need to put the legal framework in place, no matter how imperfect it might be. No autonomous car can actually drive on the road unless certain regulations
are passed by each state.
So the answer to the problems you pose is a world of fully autonomous vehicles? I have so many layers of objections to this potential solution that I don't know where to begin. I don't deny that engineers have the capability to produce solutions to problems, and that there is a huge potential market for such capability, assuming the technological issues can be solved. But would you trust your life, the lives of your family, and the lives of others to a foreign corporation? Having spent the last 30 years of my life as a practicing engineer and seen the limited benefit of sophisticated software (remember the blue screen of death?), I am doubtful we can create a solution that is better than what we have now anytime soon.
Just a taste of the problem: right now, human drivers decide the speed limit. Often that speed is different from the posted speed limit. There are many reasons for this, but software would need to decipher why in every situation, whereas a human can, in an instant, completely (mostly, anyway) understand it and react. How would software understand this context, and how would different manufacturers ensure that their interpretations are consistent with each other? How would fault be assigned in an error situation? Are we to establish a body, similar to the NTSB, to go and investigate the cause of every accident? Who would bear the brunt of such costs? Higher taxes?
Many of my concerns surround the human element, which hasn't been able to evolve as fast as technology. As with the situation concerning "drones," the public should spend much more time on this subject debating the merits of the whys and the shoulds in addition to the "I cans." There are so many examples of technology reaching the marketplace before the public has a chance to ponder its impact. Fully autonomous vehicles, on the other hand, are a solution that has plenty of time for public consumption and digestion prior to implementation, and we should use that time to design and discuss a universe (if possible) for mass use and adoption. And this discussion should - no, must - occur well before we have the ability to implement it.
You ask about "someday all cars being autonomous". Well, maybe, but pedestrians, cyclists, and animals (both tame - think horses - and wild) use the roads too, so there will always be some non-computer-controlled actors on the road (except, perhaps, on motorways). And if someone gets injured, there will be a desire to apportion blame.
As I was cycling to work this morning, I was wondering how the Google car would cope with the wildly varying standards of other road users.
Or, here's another take, zchrish. Although driving SHOULD be a serious responsibility, society has for far too long abdicated its responsibility to protect itself. We have allowed poorly trained incompetents, and the sleep-deprived or substance-impaired, to drive with impunity. Statistics show that human error is far too commonplace and frequently deadly, and it's high time society did something about it.
Driving is private transportation, no different from riding a horse, or a horse-drawn carriage, or even walking. No one would consider those activities to be a "privilege," granted to us graciously by some benevolent monarch. Instead, driving is the modern form of private transportation, which people should expect to be able to do, except in the most extenuating circumstances.
It is we, the people, who have put (not terribly successful) limits on who can drive, because it is we, the people, who have deemed that cars are very fast and dangerous, in the wrong hands. So if there's any "privilege" involved at all, it is essentially a "privilege" given to us by OURSELVES, through our appointed proxies.
Just as we, the people, have required that our neighbor take a driving test before venturing out in the neighborhood and endangering others, we can also finally realize that people are fallible, even at their best. It becomes incumbent upon us, the people, to take responsibility for improving what is clearly an unacceptable state of affairs, given the modern technology now available to us, and to reduce our vulnerability to the fallibility of others.
1. Humans don't change; need to factor in the lowest common denominator
2. Hackers will find a way to mess up either my car, someone else's or whole groups of drivers
3. Persons of malicious intent will re-program the vehicle to suit their own aims
4. Persons in accidents could alter the driving log to advance their own agenda
5. Driving is a privilege and a responsibility that should be held to a higher standard than that of robotics. This isn't an area where society should abdicate its role as responsible steward of the road
I'm thinking that for driver assistance systems, such as yaw control and ABS, or even the radar collision avoidance system Mercedes has, the legal framework should not be so different from what we have today, because the driver is still supposed to be in control.
So I see two scenarios, as I described before. If the system was not quite effective ENOUGH to prevent the accident, there's no difference from purely manual driving. ABS helps, for example, but it isn't magic. If ABS didn't quite do enough, I would not expect any court to hold the manufacturer liable.
The other scenario is more like the Oklahoma Toyota case, where an automation system CAUSED the accident. But that would be no different from any other vehicle defect. So again, not unlike what the law already has to contend with.
So, to respond to your quote:
Automated systems are now permitted as long as they can be overridden or deactivated by the driver. This has established the legal foundation for partially automated driving since control of the vehicle may now essentially be assumed by systems as well.
I read quotes like that and I think, "eeeh, not really." There are tons of automated systems in cars that cannot be overridden. A few are clearly safety related. ABS is one. I can deactivate my stability control, although it defaults to "on" again every time I start the car! But not ABS. ABS overrides the driver's activation of the brake pedal. So, already today, we have at least one safety-related automated system that CANNOT be overridden. Windshield wipers are another (ancient cars had hand cranks for the wipers). And of course, never mind engine controls and transmissions. All of these systems perform tasks that once were completely under the control of the driver.
If the fuel injection malfunctions, causing an accident, would anyone complain that if we had retained Model T Ford fuel controls, that accident might have been avoided? No. Instead, they would treat that fuel injection problem like any other mechanical failure for which the manufacturer would be responsible. Ditto with ABS or stability control.
I think the angst automation causes is only temporary, until the average Joe has experienced it so long that he forgets it even exists. Let's not forget that today, the driver is really only in control of the throttle (partially, except in cruise control and engine idle speed control), the brakes (partially, except for ABS override or the Mercedes collision prevention system), and the steering (partially, except for yaw control, which in some cars acts on the steering system vs. the brakes).
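To make the "partial control" point concrete, here's a toy sketch of ABS-style arbitration between the driver's pedal input and the actual brake actuation. The function name, threshold, and modulation factor are all invented for illustration, not taken from any real ECU logic:

```python
def abs_arbitrate(driver_brake_request, wheel_slip):
    """Toy model of ABS-style brake arbitration (illustrative only).

    driver_brake_request: 0.0..1.0, how hard the driver presses the pedal.
    wheel_slip: 0.0..1.0, estimated wheel slip ratio.

    The driver sets the intent, but the automation layer can reduce
    (never increase) the applied braking force when the wheels lock up.
    """
    SLIP_THRESHOLD = 0.2  # invented value, not from any real system
    if wheel_slip > SLIP_THRESHOLD:
        # Modulate: back off the brakes regardless of pedal position.
        return driver_brake_request * 0.5
    return driver_brake_request

# The driver "controls" the brakes, yet the output can differ from the input:
print(abs_arbitrate(1.0, 0.1))  # no slip: 1.0, driver's request passes through
print(abs_arbitrate(1.0, 0.5))  # heavy slip: 0.5, the system overrides
```

The point of the sketch is the shape of the logic, not the numbers: the driver's input is one signal among several, and the final actuation is decided by software.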
So really, much like the hype about IoT, most of what we're talking about is a continuum. Not something brand new.
I suspect that much of the legal issue will end up being first untangled in the courts, rather than by plan. There are probably more variables related to the legal system than related to the ability of a car to be automated.
Hacker-created self-driving cars will start showing up before the legal framework has been worked out. They'll be mixing with fully manual cars and branded augmented driving systems. The results of that mixing will end up leading to legal precedents and then legislation.
Drones are, in essence, flying autonomous vehicles. The pros and cons surrounding drones today might well foreshadow the debate over the development of self-driving cars. In the context of a strongly regulated aviation industry, "self-flying" drones pose a fresh challenge. How safe is it to fly drones in different environments? Should drones be required to stay within visual line of sight, as piloted airplanes are? Join EE Times' Junko Yoshida as she moderates a panel of drone experts.