As a hardware guy, this development has always worried me. I can't get away with that: my boards (electronics) always have to work.
I have wondered for years how this could go so wrong. The Toyota example is due to badly thought-out firmware design, so basically software. Bad software on lethal moving vehicles is something to worry about.
Maybe in ten years we will have to accept 'terms of agreement' if we want to buy our cars...? Let me suggest not going down this road: it is better to attach full legal liability to bad software design. That is where the attention should go, to give my five cents...
Pre-impact EDR download data is very limited in extent. A typical pre-crash data download matrix in Bosch EDR format has the following characteristics:
The sampling is generally at 1-second intervals before the crash; in this case there are six samples.
Four variables are recorded against time: speed, brake switch ON or OFF, accelerator rate (a voltage related to the accelerator position) and engine RPM.
Engine RPM is recorded in 400 RPM steps, rounded down, which means that an engine speed of 799 RPM would be recorded as 400 RPM, whereas anything from 800 RPM up to 1199 RPM would be recorded as 800 RPM.
Whether the brake switch is ON or OFF is recorded, but not the brake pressure.
Time is recorded with reference to impact, but the absolute date and time of impact are not recorded.
This is a very sparse data matrix and only of limited use in determining what was going on before impact.
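To illustrate how coarse that record is, here is a minimal Python sketch of the 400 RPM bucketing described above. The field layout and sample values are my own illustrative assumptions, not the actual Bosch format:

```python
# Hypothetical illustration of the 400 RPM bucketing described above.
# Record layout and sample values are assumptions, not the real Bosch format.

def bucket_rpm(true_rpm: int) -> int:
    """Round engine RPM down to the nearest 400 RPM step,
    as the EDR record does (799 -> 400, 800..1199 -> 800)."""
    return (true_rpm // 400) * 400

# Six hypothetical pre-crash samples at 1-second intervals.
samples = [
    # (seconds before impact, speed mph, brake switch on, true engine rpm)
    (-5, 62, False, 2350),
    (-4, 63, False, 2410),
    (-3, 65, False, 2600),
    (-2, 68, False, 2790),
    (-1, 70, False, 3010),
    (0, 71, False, 3190),
]

for t, speed, brake, rpm in samples:
    print(f"t={t:+d}s  speed={speed} mph  "
          f"brake={'ON' if brake else 'OFF'}  "
          f"rpm(recorded)={bucket_rpm(rpm)}")
```

Note that two quite different engine speeds (e.g. 2790 and 3010 RPM) can land in adjacent buckets while everything in between is lost, which is part of why the record is of such limited use.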
In an ideal world an automobile black box would record a number of other variables to allow cross checking. The sampling rate would be increased probably by a factor of about a thousand. The time scale would be extended back to perhaps half an hour before the crash and the record would be date and time stamped in some way. There would also be video recording. Amongst the most important variables recorded would be the system voltage and current, throttle PWM duty cycle and throttle angle. One might then get some idea as to whether Task X had been having a hissy fit at the kitchen sink.
The data recorder would have to be entirely independent of the Electronic throttle control and the CAN bus.
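The wish-list above could be sketched as a record structure. All field names and parameter values here are my own assumptions for discussion, not any real or proposed standard:

```python
# Illustrative sketch of the richer "ideal" black-box record proposed above.
# All names and values are assumptions for discussion, not any real format.
from dataclasses import dataclass


@dataclass
class IdealEdrSample:
    timestamp_utc: float        # absolute date/time stamp (epoch seconds)
    speed_mph: float
    brake_switch_on: bool
    brake_pressure_kpa: float   # actual pressure, not just the switch state
    accel_pedal_volts: float
    engine_rpm: int             # unquantized
    system_voltage: float       # for cross-checking electrical faults
    system_current_a: float
    throttle_pwm_duty: float    # throttle PWM duty cycle, 0.0-1.0
    throttle_angle_deg: float


# Roughly 1000x the 1 Hz EDR rate, retained for ~30 minutes before impact,
# logged by hardware independent of the throttle control and the CAN bus.
SAMPLE_RATE_HZ = 1000
RECORD_WINDOW_S = 30 * 60
print(f"samples retained per channel: {SAMPLE_RATE_HZ * RECORD_WINDOW_S:,}")
```

At these assumed rates the recorder would hold 1.8 million samples per channel, which is trivial for modern flash storage; the hard part is keeping the logger electrically and logically independent of the systems it is meant to watch.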
Toyota and other automobile manufacturers have been able to claim that a record such as the one above "proves" that the driver did not have their foot on the brake - ergo they must have had their foot on the accelerator pedal when they meant to put it on the brake. In other words, they use the EDR record to insinuate that the driver has been "startled" into making a pedal error. [This process of condemnation, it seems to me, is roughly equivalent to the medieval process of testing for witches with a ducking stool: if the wretched woman sank in the pond she was proven innocent but drowned anyway, and if she rose to the surface she was proven a witch and was burnt at the stake. Rather unfair, but guaranteed to get rid of witches.]
I have examined a number of EDR records and have written two reports that are in the public domain. In one particular case I have been able to compare EDR and video records and I could find no correlation whatsoever between the two. I would be very pleased to make these two reports available - they are not subject to any gagging order.
I was very interested to hear that:
"He (Michael Barr) also told EE Times that the expert group found thatToyota's black box can malfunction during unintended acceleration specifically, and this will cause the black box to falsely report no braking."
It explains my own findings! It would be interesting to know if anyone else has had a similar experience.
Stuxnet is now open source and configurable and could potentially be used by anyone.
"I just hate to think what the casual chip hacker can do to the operating parameters of a modern car EMU."
The mind boggles at the possibility of a chip hacker mixing a bit of scrambled egg with what was described in the Bookout case as "spaghetti" software. What would happen if they tried to reorganize the "kitchen sink" tasks as part of the tuning process?
"As a past assurance engineer for a very large fire protection agency, the ability for the wrong operating parameters to be loaded into fire alarms, caused me to order a change in the upgrading and loading software for the fire alarms we used".
You raise a very important issue. In any manufacturing organization, design variation has to be controlled very carefully. I expect that Toyota has the assembly process under pretty good control, with the correct parts delivered to assembly points for the particular batch build. However, with software, change control may be more difficult.
I wonder how Toyota controls the issue of upgrade software to its dealers, and whether it informs owners when a software upgrade has been done?
I will put excerpts here, in Betsy Benjaminson's own words.
First and most shocking were the reports horrified drivers wrote about their runaway cars. Second were startling emails Toyota's engineers had sent each other. They were searching for UA's root causes, but they could not seem to find them.
They sometimes admitted it was the electronic parts, the engine computer, the software, or interference by radio waves. Meanwhile, efforts were made to find floor mats that would trap gas pedals and conveniently explain UA. The R&D chief admitted that incompletely developed cars had gone into production and that quality control of parts was poor or non-existent.
Third, I read many descriptions by executives and managers of how they had hoodwinked regulators, courts, and even Congress, by withholding, omitting, or misstating facts.
Last, and most damning, I found Toyota's press releases to be bland reassurances obviously meant to help maintain public belief in the safety of Toyota's cars—despite providing no evidence to support those reassurances. I saw a huge gap between the hard facts known by engineers and executives and the make-believe produced for public consumption by Toyota's PR department.
And then there is the plain mix-up. I believe I read that Ariane 5 had Ariane 4 software loaded for its first lift-off? Even here it is not a mistake or an accident but a failure in process control.
As a past assurance engineer for a very large fire protection agency, the possibility of the wrong operating parameters being loaded into fire alarms caused me to order a change in the upgrading and loading software for the fire alarms we used.
I just hate to think what the casual chip hacker can do to the operating parameters of a modern car EMU.
Am I paranoid or could I also worry about some software being sabotaged for state or commercial imperative?
@Crusty: Yes, there's nothing new under the sun. This story is an old one, as you say, especially for the automotive industry. Short memory syndrome and complacency; it is human nature, after all.
It's precisely because of Ralph Nader's book and the way the American car manufacturers treated customers that generations of the American car-buying public had so much faith in the basic quality and value of Japanese cars. Too bad Toyota squandered that good faith. Consumers had the impression that Japanese car makers had a built-in process that produced good cars, cars you wouldn't have to take to the shop constantly, and that wouldn't kill or injure you (you were still capable of getting yourself into trouble in the wrong spot at the wrong time, but that's just the nature of driving a car).
That reputation for quality, backed by good ratings, was why Americans bought Toyota. This case definitely damages that reputation more than it would have if they had fessed up at the beginning. Consumers often have long, irrational memories: some people won't buy American because they remember the bad days of the industry, even though times may have changed.
As software becomes more complex and does more in cars, I hope software teams follow good processes for testing their code.
Yes, given that this has been happening for a number of years, even if infrequently, one wonders why the people involved in the design of this software didn't have second thoughts about their architecture. You know, while commuting to and from work, for instance. Even if the heavies didn't know, it seems hard to believe that no one had one of those "oh sh**" moments going back over the code. No?
@Susan: I have been following this article and its threads, and the thought has come to me that this is not a new situation in the transport industry. It goes back a long way before software. I was looking up the worst cars of all time and googled Ralph Nader, who wrote a book called "Unsafe at Any Speed." I also know of buses that had power surges for no reason at all, and railway safety has only been driven toward safer practices by disasters.
One of the big problems is that corporations have a very short memory regarding how they came to a disaster, because the people who were there at the time of the disaster or design flaw move on or retire, and the cycle starts again.
Is it possible to build into a company a management process that learns from previous problems, or even wants to?
In reading this article, I felt sympathy for the people who died and for their families. Toyota could have made it simpler on them and admitted the problem had to do with software. They had to recall the vehicles anyway.
Drones are, in essence, flying autonomous vehicles. The pros and cons surrounding drones today may well foreshadow the debate over the development of self-driving cars. In the context of a strongly regulated aviation industry, "self-flying" drones pose a fresh challenge. How safe is it to fly drones in different environments? Should drones be required to stay within visual line of sight, as piloted airplanes are? Join EE Times' Junko Yoshida as she moderates a panel of drone experts.