Curious as to what you mean by 'seatbelt regulations'? For a few dollars of BOM cost? Could you please elaborate? I'm having a hard time picturing what this might be.
Also, near-zero deaths with minimal costs? Again, you're a little short on details here. We could achieve near-zero deaths with a hardware-mandated speed limit of 25 mph or less. Life would be grand. Or outlaw personal transportation altogether.
Of course, safety measures with political or economic costs get push-back. For good reason. We live in a republic where the People have a say in how the People are governed. We are willing to accept 35K deaths per year because, in the aggregate, we, the People, have decided that the risk to our 'own' lives is acceptable for the benefits we derive from our forms of transportation.
As for Mr. Pratt: I'm sorry he's raining on your parade, but he is simply stating what most thinking people have been trying to say all along. You are welcome to keep believing that autonomous vehicles will be mainstream within the next 20 years, but there are HUGE hurdles to overcome for a solution that may not even be desired. This will be a many-decades process.
This means Toyota is dead in the water and they are just trash-talking the competition.
The 35k deaths argument is built to mislead.
We can't accept fatalities where the robot is at fault, that is true. The robot also needs to avoid some accidents humans wouldn't be able to avoid, but that's already being done today.
Realistically, we could achieve near-zero deaths with human drivers today at minimal cost, but the world isn't trying all that hard. If a safety measure has economic or political costs, there is strong pushback.
35k deaths per year, and many, many more permanently debilitating injuries, yet the vast majority are avoidable. Proper seat belt regulations alone would save many thousands. It would add a few dollars to the BOM, generate some extra revenue for the government, and cost some votes, but apparently that price is too high to pay to save many thousands of lives.
The robo-car must not cause deaths or permanently debilitating injuries, but that doesn't mean it can't cause any incidents at all, or that it must avoid every incident when another vehicle is at fault. Can Toyota build an autonomous tank traveling at 3 miles per hour that doesn't kill anyone? If they can, they are almost there.
If they really think as they claim, they are repeating past mistakes. They focus on the problems without looking for solutions at all, just like they did with electric. Take the entire system (the car), set the goal, and find ways to get there, instead of setting a goal and focusing on why you can't reach it.
I agree. It's analogous to flight training using "unusual attitudes," where the pilot being trained dons a visually restrictive hood and closes his eyes while the instructor places the aircraft in an abnormal flight attitude, possibly retrimming the flight controls, then hands the aircraft over to the student, who must open his eyes, assess the situation (in this case from instruments only), understand the recovery options, and safely resume flying the aircraft. Even that doesn't take 15 seconds, and I would argue the impending danger is less (it's done thousands of feet in the air with no other aircraft close by), so the recovery is less demanding; the student is fully aware the 'emergency' is coming; and the student has a higher-than-normal aptitude for the skill being developed. Fifteen seconds for a lower 6-sigma driver in traffic sounds about right.
"We do not accept death as a byproduct of an automation program. Period - end of story."
What about the simple case of an electromechanical circuit breaker in an electrical panel? If the breaker fails, a house burns down, and someone dies, we don't do away with circuit breakers or stop electrifying houses. I think there will be acceptance of some level of death caused by automation errors, as long as it is lower on average than the alternative of human-controlled driving.
By 15 seconds, I am assuming he was talking about a situation where a person (the driver) is doing something else, like checking email or watching video, when the car suddenly asks him to take over.
That is, they need to reduce fatalities by 90%. IMHO that's the mark where people will accept autonomous vehicles over human drivers. The lawyer problem will be easily solved like it always is, via insurance. If there's a 10x lower chance of killing someone, the insurance will even be cheaper. It doesn't matter whether the car owner or the carmaker pays for it; someone will.
But on the whole I agree with the Toyota exec. There's way too much hype, and idiots out there talking about having autonomous vehicles on the road in 2020. I just want to see them before I get too old to safely drive myself, which is at least several decades away. If they come sooner, great, as I'd prefer taking a 1000-mile trip by car, if I could work/read/sleep during the trip, to hassling with the TSA and stupid flight delays!
Finally an honest discussion of the automation of vehicles. I've been saying this for a while now but who am I? We do not accept death as a byproduct of an automation program. Period - end of story.
We already see in the news the uproar over deaths in self-driving-enabled vehicles. It will only grow to the point where the lawyers step in and ruin the entire concept. Expensive lawsuits will make it impossible even to develop the technology.
At last some sensibility coming through amongst the hype of fully autonomous automobiles. Skipping Levels 2 and 3 also makes a lot of sense. There is no way society will accept a mode that requires the driver to take over in an emergency. Within 60 seconds of monitoring, any human will drift off to check text messages, eat a sandwich, or make out in the back seat.