A few opinions to add, and why I think those opinions are valid:
- I fly
- I drive cars
- I write firmware, software, and do a bit of hardware as well
- I'm also qualified to drive trains
Given this, especially the trains bit, I do understand fatigue, and how it can degrade one's performance without much warning.
So, here's the scenario:
A person who doesn't realize how tired they are, maybe coming home from a long shift or a party (but without being inebriated) is in their auto-drive vehicle.
They start getting a fairly large number of false alarms, and are also eager to get home and relax. Hidden somewhere in that stack of "annoying distractions" is a REAL situation that will occur at some point among the false alarms.
A person will tend to develop a rote, nonthinking "muscle-memory" reaction to numerous false alarms, especially if they are all similar and occur frequently enough. An untrained person may even become emotionally affected, "annoyed" by their car continually insisting that they back up its logic with human intervention.
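To put rough numbers on why habituation happens: when real hazards are rare, even a decent sensor produces mostly false alarms, so the driver learns that "alarm" almost always means "nothing". A minimal sketch, with all rates hypothetical (none of these figures come from any real system):

```python
# Hypothetical illustration: how rare real events make most alarms false.
# All three rates below are assumptions chosen for illustration only.

p_real = 0.001             # chance any given alert interval holds a real hazard
p_alarm_given_real = 0.99  # sensor fires when the hazard is real
p_alarm_given_none = 0.05  # sensor false-alarms when nothing is wrong

# Total probability of hearing an alarm at all
p_alarm = p_real * p_alarm_given_real + (1 - p_real) * p_alarm_given_none

# Bayes' rule: probability that an alarm you just heard is real
p_real_given_alarm = p_real * p_alarm_given_real / p_alarm

print(f"P(alarm is real) = {p_real_given_alarm:.3f}")
```

With these assumed rates, only about 2% of alarms are real, which is exactly the kind of ratio that trains a rote "ignore it" reaction.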
For example, in aviation, when there's a stall-warning sound, we quickly look at the ASI (airspeed indicator) and simultaneously relax back-pressure on the stick, or sometimes even add forward pressure. Depending on the aircraft, we might also be vigilant for a wing dropping in either direction.
In aircraft, in a combined pitot-icing and stall-sensor icing scenario near VNE (the never-exceed speed), that rote reaction could be almost instantly fatal. False-alarm stall warnings can also occur in gusty or turbulent conditions. Again, if the speed is already too high, a reaction that causes the aircraft to speed up further could be very bad.
Luckily, the stall warning doesn't false-alarm very often. But if it does, and the pilot is fatigued and suffering from "get-home-itis", their reactions may be inappropriate.
In a train, a repeated warning (such as slip/slide on a rainy or icy day) might be masking a more insidious problem, like a critical loss of braking authority or the inability to climb a hill safely. Malfunctioning safety equipment, which continually warns of minor or nonexistent situations, only adds to the workload and fatigue factor.
So, I think that a "check-ride" in an automatic driving car, with an experienced co-driver is probably a very good idea, if only to brief the new driver on the kinds of situations that might arise.
I think the scandal of one or two self-drive cars injuring someone due to "operator error" would far outweigh any consumer resistance to uptake caused by some "mandatory briefing" drive that had to occur before taking ownership of the car.
It seems ludicrous that a new and potentially very dangerous large piece of metal, regardless of how smart it is, should require no additional training at all to operate safely. That training could simply be some exercises in taking over the controls, in various scenarios, when the auto-disconnect alarm sounds.
I agree, driving skills are going downhill... with the population aging it will only get worse... and with cars taking over driving it will get worse still. There will be no way for an average driver to take over in a difficult situation; such a driver simply will not know what to do.
The problem is not the carrying of passengers, it is the competency of the individual involved.
Think occasional Saturday small plane Pilot VS Capt Sully who made the famous landing in the Hudson River.
There is your comparison. My point is that the drivers on the road today are not ready to suddenly take over in an emergency. Many of them are barely competent even when they enter that situation already in control and aware of the road.
Pilots are well trained and periodically tested. Drivers are not.
Pilots generally will have some time to evaluate a problem before it becomes critical, since they fly with controlled spacing of thousands of feet of altitude. Drivers have neither 1) controlled spacing from anything, nor 2) periodic training and retesting.
I will not own a self driving car unless it has a NASCAR rated protection cage in it.
What about the "normal" situation where you are driving on a multi-lane highway and, as you start to approach a car on the left (the normal passing lane), you notice the driver ahead is looking into your lane, possibly telegraphing an unsignaled lane change? As an experienced driver I look out for that all the time and slow down to avoid getting hit. Would we expect that kind of insight from an autonomous car, or does the whole autonomous-car movement require that EVERYONE be in autonomous cars? I have seen cars swerve at nothing simply because the driver sneezed! Again, as a long-time driver I was able to notice the behavior (the driver shaking) and avoid getting hit. What happens if I get hit while driving my autonomous car: am I at fault, or not?
I have driven a robot using radio control with a tight user feedback loop (www.usfirst.org) and saw real ugly happen quickly. At 120 lb and traveling 10 to 18 feet/sec, a failed sensor can cause serious mayhem. Even with the watchdog monitors and disable/kill switches, it takes some time to stop the robot; imagine a 1/2- or 3/4-ton car moving at 65 mph! It is mind-boggling to think about the possible results. Even planes, with all their autopilot capability, still use pilots; there is just no substitute for a pilot.
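To put a number on that comparison, here is a rough kinetic-energy calculation using the figures above, taking the robot at its top quoted speed and assuming "3/4 ton" means about 1500 lb of vehicle mass (a low estimate for a real car, so the real gap is even bigger):

```python
# Rough kinetic-energy comparison: 120 lb robot at 18 ft/s vs.
# an assumed 1500 lb (3/4 US ton) vehicle at 65 mph.
LB_TO_KG = 0.453592
FT_TO_M = 0.3048
MPH_TO_MPS = 0.44704

def kinetic_energy_joules(mass_kg, speed_mps):
    """KE = 1/2 * m * v^2"""
    return 0.5 * mass_kg * speed_mps ** 2

robot = kinetic_energy_joules(120 * LB_TO_KG, 18 * FT_TO_M)
car = kinetic_energy_joules(1500 * LB_TO_KG, 65 * MPH_TO_MPS)

print(f"robot: {robot:,.0f} J, car: {car:,.0f} J, ratio ~{car / robot:,.0f}x")
```

Even with that conservatively light "car", it carries roughly 350 times the robot's kinetic energy, which is why a stuck sensor in a road vehicle is in a different class of hazard entirely.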
Not that new. In an article in New Scientist, a lawyer drew a parallel to horse-drawn carriages, also a somewhat intelligent autonomous mode of transport. And horses can be spooked and accelerate uncontrollably.
Why would cars be driving around without occupants?
To go where future occupants are to pick them up, go park somewhere after dropping occupants off, take itself in for maintenance, drive to a store to have purchases loaded inside... you know, basically most of the things cars do WITH occupants.