@WKetel, you articulated what I had thought better in your comments here.
Every time I get a briefing on ADAS, companies stress the importance of "getting human judgment out of the equation." And yet, as ADAS paves the way toward a future of self-driving cars, I am shocked that very few have publicly discussed the issue of "preparing drivers to handle the exception situation."
@Scramjet, your posting confirms the idea that drivers and pilots need to be prepared to handle exception situations, which are exactly the type of thing that computer systems will never be able to handle. What is an inconvenience when using a computer could be fatal when driving a car. Picture that "blue screen" at 60 MPH, and you realize there would be no good ending available.
@WKetel, I agree with you. Unfortunately, your logic doesn't necessarily apply to traffic laws.
Just because something has the virtue of being an obvious truth, and the opposite is more expensive, less reliable, and a fundamentally bad choice, doesn't mean that it won't be codified into a bad law that we'd be stuck with.
As the t-shirt says: "If you're not part of the solution, there's a lot of money to be had in prolonging the problem." The tendency of government is to take something that doesn't work, decide that the reason it didn't work was that they didn't do enough of whatever it was, go all-out with the bad idea, and then start a new program to stamp out the new problem.
Start with the premise that "drivers who are exposed to driverless technology become worse drivers through lack of practice," and see where it goes.
@ScRamjet, this is great input. Thanks for sharing. So autopilot in fact requires "more" training, in a way, to assess a pilot's attention span in various conditions. That is fascinating...and reassuring.
Such one-to-one training for every driver who will be "driving" a self-driving car is not realistic, and yet I see a strong parallel that needs to be drawn, to an extent....
My brother is a pilot for a major airline and has flown intercontinental routes as well as local day runs into tight-approach airports.
He talks about those incidents frankly. Poor pilot training is, in his estimation, the root cause.
Pilots must be tested in simulators and by instructors riding along periodically; make these scenarios part of that training. Give them a no-error flight and watch the attention span. Give them a stressful flight and watch for fatigue. Give them a bad-weather flight with a tough landing, and watch the reaction times and the precision of control. All these factors must be evaluated.
Training must be tailored to the individual pilot and his or her areas of weakness.
He touts Capt. Sullenberger as what to expect, not an exception. Unfortunately, he also feels this is not the norm. The norm is a crew that can fly the usual weather without upsetting the passengers or bending the plane. That all-engines-out emergency took icy calm and solid decision making. You only get that with pilots who are experienced and still have the mindset of making every decision fast and correct.
These self-driving cars will need to be hybrids: autonomous when you want to shut out the frustrating morning commute while you work, and hands-on when you are driving Highway 1 and want the experience of that ultimate driving machine.
Drones are, in essence, flying autonomous vehicles. The pros and cons surrounding drones today may well foreshadow the debate over the development of self-driving cars. In the context of a strongly regulated aviation industry, "self-flying" drones pose a fresh challenge. How safe is it to fly drones in different environments? Should drones be required to remain within visual line of sight, as piloted airplanes are? Join EE Times' Junko Yoshida as she moderates a panel of drone experts.