JCreasey, autonomous vehicles can actually make one part of the problem less difficult: not the overall complexity, but oversight of the situation, i.e. situational awareness. In the Toyota scenario we're discussing, there is no way to independently judge intent or consequences.
Bert, I had a mechanical throttle malfunction also. It was a '53 Buick V8 with a Dynaflow automatic transmission. Somehow an acceleration attempt over-compressed a worn motor mount to the extent that engine torque rotated the engine block, relative to the engine compartment, beyond the design tolerance of the all-mechanical carburetor linkage, and it jammed wide open. I quickly turned off the key, which brought me to problem number two: no power steering. I was on a winding road and had to turn the ignition back on to steer. A fortunate stretch of straight road let me kill the engine and bring the car to a safe stop.
@RoboticsDeveloper, good to hear from you again. "Given all that I read in the article it makes me quite concerned about self driving cars."
The lawyers must be salivating at the thought of self-driving cars. Accidents will still occur, and there will be no driver error to blame. The blame will fall to the automakers, the designers of the roads, the municipalities if those roads are not properly maintained, and so on.
Rich Pell, that was what I read into that statement. What worries me more is that it was possible to record false data in the first place. That seems to be a design failure that should have been caught early in the design review process. All that said, I wonder how many drivers have been wrongly accused of being the cause when black-box data is used and treated as if it were an impartial means of data collection. It makes me wonder, for example: the jury members for this trial needed some technical understanding and discernment, otherwise how could they come to the right conclusion? If my dad had been on the jury, most if not all of this would have been quite over his head. I find this aspect of the trial very interesting, and I wonder what the jury selection process entailed.
MeasurementBlues, I thought the article implied the memory corruption "may" have caused the bit flip. Given all that I read in the article, it makes me quite concerned about self-driving cars. I hope that standards similar to the FDA's for life-critical devices will be employed. With a little (very little) experience in fail-safe coding and hardware design, it seems obvious to me that cabling could fail in many ways. The cable signal design should have provided an easy means of detecting a single- or multiple-line cable failure, something like the old active-low signals with pull-ups used on backplanes. It is important to keep in mind the technical challenges involved in the coding, but I wonder if there should be an electronic override feature that provides either a fresh reload of the code (if that can be done safely; I don't know what the reload/power-up sequence looks like) or a fully parallel "simple" processor that allows "direct" user control with minimal bells and whistles. Just thinking that, if nothing else, being able to take back control by as manual a means as possible would be at least reassuring.
Was the rate of acceleration specified in any of the reports the two agencies provided? Or, in the case of an accident, was it determined approximately how fast the car had accelerated?
The "failed to comply" simply refers to the general OSEK compliance testing.
There is (or was; it's been a while since I was involved in OSEK) a requirement that you submit your OSEK implementation for compliance testing before you are allowed to call it an OSEK-compliant operating system. Toyota's OSEK apparently hadn't been submitted for this testing, so it was not officially OSEK-compliant, and Toyota couldn't legally refer to their OS as OSEK, since the OSEK trademark terms permit that use only for compliant operating systems.