Junko, the concept of a camera inside the vehicle watching the driver will never be accepted. At one company we had a system that watched drivers to evaluate driver-drowsiness-detection systems. That required special agreements with the drivers' union and with individual drivers. And consider that at least one installation of a GPS-based vehicle-location monitoring system was removed after a court battle because it somehow violated the drivers' right to privacy (goofing off???). I would certainly disable any such system installed in a vehicle that I owned. While I am being paid by a company, they do own the time they are buying, so that is different from being watched every moment that I am driving. Besides, such a system could easily be compromised to the point of being useless and worthless. And it was discovered many years ago, in our factories, that an automation system that requires constant human attention is of very marginal value at best. That is fifty-year-old knowledge. I would hope that by now it would be understood that what would be best is a good driver-perception-enhancement system.
And it would be very good to find out just how the laws would handle driverless vehicles before any more time and energy is spent going in that direction. Just because engineers can make a system do something does not make it a smart choice to actually do it.
WKetel, this is definitely a topic that I would like to follow.
I am in Tokyo this week, covering the ITS World Congress. When I brought up the subject of the Autopilot issue, he mentioned something that might be relevant. Companies like Volvo are already talking about placing a camera inside the car (not outside), so that the car will know what the driver is doing (or see whether the driver is engaged in actual driving). I found that fascinating!
Junko, you have brought up the one thing that will probably keep self-driving cars off the road forever, which is the question of who is responsible when the system fails and there is an accident. That is the question that must be answered before the cars can be let onto the roads, at least I hope that it is answered. And if it turns out that the answer is that I am responsible, well, then I certainly don't want some computer making the driving decisions for me, since I already know that computers don't think the way that I do. So if I don't want that system making decisions that I am responsible for, then why should I pay to own such a system? And probably a few others will make that same decision as well. So maybe that will be the end of it. If one does not wish to drive, there is already the bus and the train. Both have others doing the driving.
Damn, WKetel. Actually, what you are talking about here is really profound.
Few are willing to touch the topic, let alone acknowledge the existence of the challenges associated with "handling exceptions."
As for self-driving cars (I know, I know, it won't happen next year), who will ultimately be held responsible for potential accidents that have anything to do with "handling exceptions"? I think I already know the answer. It won't be carmakers. It will be drivers.
When approaching a light that turns yellow I always check the mirror before deciding whether to stop or keep on going. Would I trust a programmer to code the same reaction? Based on the many dumb things that human-coded software does, NOPE!
My concern about those drivers who seem to suffer from 17-second reaction times when the light turns green is: what are their reaction times when it turns red? A long reaction time there could be really bad for those already stopped. And I do presume that I have some small right to not have other drivers bash into me, at least when I am following the rules and driving very safely. Actually, the one time that a driver did bash into my car I was stopped at a red light. Unfortunately I was not watching in the mirror, or I would have moved. Would a computer-driven car be able to get out of the way of a car that could not stop? I did do that once: made a right turn on red, and the driver coming up behind me slid into the spot where I had been stopped. Yes, it was inconvenient, but not nearly as inconvenient as getting bashed would have been. Can the computer cars do that? It is an exception, you know.
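The mirror-check-before-braking behavior described in these comments could, very roughly, be sketched as a decision rule. This is only an illustrative toy, not anyone's actual implementation; every function name, parameter, and threshold below is invented for the example:

```python
# Hypothetical sketch of a yellow-light decision that includes the
# "check the mirror first" exception the commenters describe.
# All names and thresholds are invented for illustration.

def yellow_light_decision(distance_to_line_m, speed_mps,
                          follower_gap_m, follower_speed_mps,
                          comfortable_decel=3.0):
    """Return 'stop' or 'go' when the light has just turned yellow."""
    # Distance needed to stop at a comfortable deceleration: v^2 / (2a)
    own_stopping = speed_mps ** 2 / (2 * comfortable_decel)

    if own_stopping > distance_to_line_m:
        return "go"  # cannot stop before the line anyway

    # The mirror check: a fast, close follower may not be able to stop,
    # so braking hard would invite being rear-ended.
    follower_stopping = follower_speed_mps ** 2 / (2 * comfortable_decel)
    if follower_stopping > follower_gap_m + distance_to_line_m:
        return "go"

    return "stop"

# 40 m from the line at 15 m/s, with a tailgater 10 m back doing 20 m/s:
print(yellow_light_decision(40.0, 15.0, 10.0, 20.0))   # tailgater case
# Same approach, but a follower 30 m back at 10 m/s can stop safely:
print(yellow_light_decision(40.0, 10.0, 30.0, 10.0))
```

The point of the sketch is the commenters' point: the "correct" answer depends on a state (the car behind you) that a naive stop-on-yellow rule never looks at.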
"I guess eventually cars will drive themselves. I don't know if that is good or not, but I don't know if I want to lose control."
Arthur C. Clarke phrased a similar thought in his 1953 novel "Childhood's End". Chapter 7, 4th paragraph: "George Greggson, who had an old-fashioned dislike of automatic landings, readjusted the rate-of-descent control before answering."
Yet another of ACC's futuristic predictions that is now becoming reality.
@junko, Handling exceptions is being removed from our culture, not only in driving but in many other areas as well. It is the mindset of a lot of sadly uninformed people that handling exceptions is "somebody else's job," and that we must let them do it, no matter what. The problem is that real life does have exceptions, not only in driving down the road but in many other areas too. And the very first place where a lack of exception handling would be a problem is in driving. Why doesn't everybody see that?
Drones are, in essence, flying autonomous vehicles. The pros and cons surrounding drones today may well foreshadow the debate over the development of self-driving cars. In the context of a strongly regulated aviation industry, "self-flying" drones pose a fresh challenge. How safe is it to fly drones in different environments? Should drones be required to stay within visual line of sight, as piloted airplanes are? Join EE Times' Junko Yoshida as she moderates a panel of drone experts.