TOKYO — All eyes have been fixed on the autonomous car at the ITS World Congress here this week. For the carmakers, regulators, and technology suppliers who have gathered here from all over the world, the big concern is not so much the bells and whistles of self-driving cars as the strategies for deploying them.
Debates here covered everything from human-machine interfaces, safety, and reliability to societal acceptance and a whole new legal landscape.
One issue, however, was artfully dodged by panelists at a Tuesday, Oct. 15, executive session called "Autonomous Vehicles -- the Path to Implementation."
Put simply, how does a self-driving car, when faced with a crisis beyond its program, hand control back over to the (human) driver?
This issue is near and dear to the hearts of EE Times readers, as shown by the reader discussion at What's Going Through This Driver's Head? The debate was ignited when an EE Times reader compared a pilot flying a plane on autopilot with a driver in a self-driving car. Referencing his brother, a pilot for a major airline, the reader wrote:
Pilots must be tested in simulators and by instructors riding along periodically, make these scenarios part of that training. Give them a no error flight and watch the attention span. Give them a stressful flight and watch for fatigue. Give them a bad weather flight with a tough landing; watch the reaction times and the precision of control. All these factors must be evaluated.
At issue here is how well a self-driving car like the Google Car can handle "exceptions," and how carmakers expect a driver to take over control of the vehicle in a critical situation. Another EE Times reader pointed out:
This confirms the idea that drivers and pilots need to be prepared to handle the exception situations, which are exactly the type of things that computer systems will never be able to handle. It is inconvenient when using a computer, it could be fatal driving a car. Picture that "blue screen" at 60MPH, and you can realize that there would not be any good possible ending available.
Asked by EE Times after the ITS session how self-driving cars are designed to handle exceptions, Ron Medford, Google's director of safety for self-driving cars, deflected the question. He said, "Readiness of drivers to take over the control [of a self-driving car at a moment's notice] is simply not understood yet."
Citing a number of studies now underway, Medford explained that not only companies like Google but also government officials and regulators are eager for the results.
Autopilot and the self-driving car: not the same thing?
Medford, former deputy director of the National Highway Traffic Safety Administration, took a position at Google earlier this year. He had held the No. 2 spot at NHTSA since 2009, having originally joined the agency in 2003 after a career at the Consumer Product Safety Commission.
Medford had clearly thought about the issue. But he was careful to state only that Google hopes for a day when its autonomous cars are advanced enough to take human judgment out of the equation. That day, as he acknowledged, is still far in the future.
Continental's Christoph Hagedorn (left, with microphone) and Google's Ron Medford at the ITS executive session "Autonomous Vehicles -- the Path to Implementation."
EE Times posed the same question to Christoph Hagedorn, president and CEO of Continental Automotive Corporation Japan: "Can you draw a parallel between a pilot flying an airplane on autopilot and a driver in a self-driving car?"