According to the latest publication (thanks, Junko!), Toyota did botch a part of the Camry's software.
While I fail to see how an over-revving engine could have beaten fully applied brakes (as the driver claimed), it may have been a more subtle combined failure that affected the ESP/ABS controls, thus preventing the brakes from developing full force.
I agree with others that driver error is the most likely cause in this particular case... but the general question Junko raised is much more interesting: how do we prove or disprove that software caused the problem? Unless the failure is deterministic and repeatable, I don't see a way; these systems are just too complex... I think we have to accept some level of randomness in our lives. According to quantum mechanics, any lake could freeze on a sunny day; it just has never happened yet... Kris
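The point about non-repeatable failures deserves a concrete illustration. Below is a toy sketch (plain Python, not automotive code) of the classic race condition: two threads do an unsynchronized read-modify-write on a shared counter, so updates are occasionally lost. Each run can produce a different final value, which is exactly why this class of bug resists the "reproduce it, then prove it" approach. The function and variable names here are invented for the example.

```python
import threading
import time

counter = 0  # shared state, deliberately unprotected by any lock

def unsafe_increment(n):
    """Increment the shared counter n times without synchronization."""
    global counter
    for _ in range(n):
        tmp = counter      # read shared state
        time.sleep(0)      # yield: another thread may interleave here
        counter = tmp + 1  # write back, possibly clobbering a concurrent update

threads = [threading.Thread(target=unsafe_increment, args=(1000,))
           for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Correct answer would be 4000; lost updates make the result smaller,
# and typically different on every run.
print(counter)
```

Run it a few times and the printed value drifts; a failure that shows up once in millions of interleavings would be far harder still to trigger on demand in a courtroom or a test lab.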
Personally, I believe this particular accident happened due to human error. An elderly driver is a risk factor, and the 1992 case described by James Chiles looks very similar. And it is next to impossible to imagine a car not stopping under heavy application of the brakes (as Bookout claims).
However, there might be a rare glitch in ABS firmware or hardware. There are definite cases in which car electronics were at fault. The Russian Lada Kalina is notorious for rare yet extremely dangerous EPS problems (the EPS gear will turn to the leftmost or rightmost position and lock there). There is anecdotal evidence of electronic ABS faults on certain Citroen C3s, too.
Anyway, I'll be looking forward to more technical information about the case, if it ever becomes public.
I know. The answer should be obvious, but tracking down a piece of code in a complex automotive system and proving that it caused the failure seems like an almost impossible task. At least, that's pretty much what NASA concluded in its original investigation of this case... it did not exclude the possibility that the unintended acceleration might have been caused by software.
Since the NHTSA closed the case in early 2011 (after NASA's investigation, which found no fault in Toyota's electronic throttle control system), there has been a substantial amount of new probing by embedded systems experts. We will be publishing new findings in our upcoming reports on this matter.
I remember the hubbub a few years ago, and it appeared that the problem was not in the electronics at all. For that matter, even more years ago, the same problem was reported with Audis. Back then, it was determined that the accelerator pedal was positioned slightly to the left compared with the average, so that drivers were pushing hard on the accelerator when they thought they were braking.
Not sure if this is new information. During the initial Audi case, I remember some experts on these matters saying that drivers who experience these problems are often simply pushing on the wrong pedal, even though they are convinced they're not.
I think the answer to your question, Junko, is a simple "yes." Software controls the hardware, and hardware can kill, so yes, software can kill. I'm not sure where the doubt is here. The same logic applies to subway systems (many of them driverless, like here in Vancouver), planes on autopilot, etc. If we relieve software of responsibility, then all drivers will have a nice excuse for not doing their jobs properly ("it was the software that did it!"). So the only issue is whether software, hardware, or a human is responsible for a particular death, and that is obviously case dependent... Kris