By putting external communications capability into automotive control systems, we are opening the whole system up to hacking. We need to isolate the critical subsystems from the communications-enabled modules to prevent any possibility of accidents caused by such hacking.
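One way to picture the isolation this comment calls for is a gateway that sits between the communications-enabled bus and the critical control bus and forwards only an explicit allow-list of frames. The sketch below is purely illustrative: the CAN ID `0x7E0` and the `forward_to_control_bus` helper are assumptions for the example, not any manufacturer's actual design.

```python
# Hypothetical allow-list gateway between a telematics (externally
# connected) CAN bus and the critical control bus. Everything not on
# the allow-list is dropped, so a compromised telematics unit cannot
# inject arbitrary control frames.

ALLOWED_IDS = {0x7E0}  # assumed example: a single diagnostics-request ID

def forward_to_control_bus(can_id: int, payload: bytes) -> bool:
    """Return True only if the frame may cross to the control bus."""
    if can_id not in ALLOWED_IDS:
        return False            # drop spoofed or unexpected IDs
    if len(payload) > 8:
        return False            # classic CAN frames carry at most 8 bytes
    return True
```

The design choice is "default deny": the gateway names what it forwards rather than what it blocks, so a newly discovered attack frame is rejected without any rule update.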
Anyone with a little knowledge knows the main causes of leaks, and thus the ways people break into code and hack:
Nr1: Users who do silly things
Nr2: Flash. Adobe really f*cked up here (!)
Nr3: Microsoft: decades of enormous leaks, a shame for such a big company
Nr4: 'Embedded systems' that are really *NOT* embedded, like Win-CE. Much too complicated solutions for minor problems. Keep it simple, stupid: keep your system simple, built just for the task it has to do. This will help you avoid break-ins to your (SCADA) systems too.
Nr4 is an on-topic issue and will be a serious threat in the coming years. To make a little statement here: it is unwise to control the SCADA system of a power plant with Windoze PCs.
Let us all be warned: there is nothing to be scared about, and no reason to introduce silly laws (like in the USA), but do your job right and warn the people around you immediately as soon as you detect risks. Even in automotive. Otherwise it is just a matter of time before someone hacks into a motor-management controller, or even worse...
Yes, software will play a significant role in solving these challenges, as it seems to be the cause, but the real solution lies in systems engineering. The twist is that systems engineering was traditionally a predominantly hardware-focused discipline, and now software must have equal footing. The implications for complexity, performance, quality, cost, reuse, and more must all be weighed as potential architectures are selected. One might argue that one approach is more viable because it drives manufacturing costs down and leaves software to solve the real challenge. The good news is that if this dialogue is actually happening, and the software, hardware, and systems engineering teams are arguing proactively while designing the system, they might get it right. http://twitter.com/#!/mfklassen
The vacuum tube or transistor does not know that it is part of an amplifier. Why, that amplifier might even be inherently unstable, for all the vacuum tube or transistor knows.
Yes, engineers need to address issues of ethics, but their authority to do so is limited by uncertainty about the significance of the ethical concern, and by the politics of the situation (whistle-blower statutes are of little value when you are uncertain of the significance of your ethical concern).
Engineers have often ducked responsibility for their actions. Hackers probe supposedly secure systems for weaknesses. The question is: what happens to that knowledge when an exploit is found?
Doctors have a Hippocratic oath not to allow their skills to be misused, but unscrupulous practitioners helped torturers develop 'truth drugs', clinical psychologists advise on effective interrogation, and arms dealers develop nerve gases. But none of those things is as easy as deploying a hacking script downloaded from the internet, put there by an expert with ill or idiotic intent.
Until engineers (software and hardware) become truly professional and treat such knowledge more responsibly, executives need to consider how saving money on IT security might be seen as risky behaviour and come back to haunt them. Their advisors also need to tell them when an embedded system has potential vulnerabilities, or they may be liable as well.
I want to stress again: the problem is criminal activity, not hacking. Hackers try to improve or adapt systems for new uses. Criminals try to subvert, damage, or steal the system or its components.
As the electronic content of vehicles increases, manufacturers must protect the vehicle from being compromised from within as well as from without. The technology exists; if the vehicle is of value, customers will demand the added protection.