@fmotta, your premise, as described here, "the primary reason most 'secure' areas remain so is that Engineers have less motivation to hack than they do to create. That ratio of hack vs create is shifting a lot and fast," is an interesting one.
Especially the part where you mention the ratio of "hack vs. create."
With the growing use of RF in automotive control and access, the risk is obviously increasing.
With the ready availability of low-cost Software Defined Radios (SDRs), and with computing power now outlandishly cheap (many of us have cores/CPUs/GPUs just languishing most of the time), the ability to capture and crack any security scheme increases.
With a small amount of effort I was able to reverse engineer a keyfob with true hobby-class parts. Since the goal was to replace a friend's ~$800 replacement fob with one of my own design, this was sanctioned and legal (and she loves the new fob). It will be easier next time, since I will have a Nuand SDR and more experience.
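To make the keyfob exercise concrete, here is a minimal sketch of the kind of decoding involved. It assumes a simple fixed-code fob using on-off keying (OOK), which is the easiest hobby-class case; the sample rate, bit rate, and threshold are all hypothetical, and the SDR capture itself is replaced by a synthesized envelope so the sketch is self-contained.

```python
# Hypothetical sketch: recovering a fixed-code OOK keyfob transmission from
# SDR magnitude samples. A real capture would come from an SDR; here the
# envelope is synthesized so the decoder can be exercised on its own.

SAMPLES_PER_BIT = 10   # assumed: sample rate divided by the fob's bit rate
THRESHOLD = 0.5        # assumed: midpoint between carrier-on and carrier-off

def synthesize_envelope(bits, samples_per_bit=SAMPLES_PER_BIT):
    """Fake |IQ| envelope: carrier present for '1' bits, absent for '0' bits."""
    env = []
    for b in bits:
        env.extend([0.9 if b else 0.1] * samples_per_bit)
    return env

def decode_ook(envelope, samples_per_bit=SAMPLES_PER_BIT, threshold=THRESHOLD):
    """Sample the envelope at each bit center and threshold to recover bits."""
    bits = []
    for i in range(samples_per_bit // 2, len(envelope), samples_per_bit):
        bits.append(1 if envelope[i] > threshold else 0)
    return bits

code = [1, 0, 1, 1, 0, 0, 1, 0]   # pretend fixed code sent by the fob
recovered = decode_ook(synthesize_envelope(code))
```

A fob using rolling codes is a harder target, but the capture-and-demodulate step is the same; the point is how little equipment that step now requires.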
Simply put: if a mildly equipped person can do this, then a sophisticated attack can happen.
Risk? Yes! To what level? Well, it has been my premise that the primary reason most "secure" areas remain so is that engineers have less motivation to hack than they do to create. That ratio of hack vs. create is shifting a lot, and fast.
Yes, I had thought of the ABS example. It bears closer inspection. The correct way to implement such safety features is via a tight closed loop, between the braking system and the wheel sensors, where the system fails safe (sensor failure does not incapacitate the brakes).
This is what I'm getting at, though. It's certainly possible to design an ABS system to be hackable or just plain dangerous. So you don't do this. The ABS feedback loop remains hardwired, EVEN IF you have sensors in the system that announce faults, i.e. one-way monitoring signals only.
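The fail-safe principle described above can be sketched in a few lines. This is an illustration only, not a real ABS algorithm: the slip threshold and pressure modulation are hypothetical, and the key points are that a sensor fault degrades to plain, unmodulated brakes, and that the fault report travels on a one-way monitoring channel with no control authority.

```python
# Illustrative fail-safe ABS loop: modulate brake pressure only while the
# wheel-speed sensors are healthy; on any sensor fault, pass the driver's
# pressure through unchanged and merely announce the fault (one-way signal).

def abs_output(driver_pressure, wheel_speed, vehicle_speed, report_fault):
    sensor_ok = wheel_speed is not None and vehicle_speed is not None
    if not sensor_ok:
        report_fault("ABS sensor failure")   # monitoring only, no control path
        return driver_pressure               # fail safe: brakes keep working
    # Hypothetical slip detection: wheel turning much slower than the vehicle.
    slipping = vehicle_speed > 0 and wheel_speed / vehicle_speed < 0.8
    return driver_pressure * 0.5 if slipping else driver_pressure
```

Because the loop between pedal, modulator, and wheel sensors stays hardwired, the worst a compromised monitoring channel can do is lie about faults; it cannot take the brakes away.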
If this were a supermarket tabloid, I'd call this fear mongering. However, given that it's here on EE Times, I'd say it's fodder to prompt important thinking and discussion subjects for engineers.
Automobile systems are more closed off than are personal computers, but they are opening up and will continue to do so. Smart phones were developed in a time period where everyone was very clearly aware of the risks of compromise, yet they still have vulnerabilities. I don't at all think it's a stretch to get to where cars are open and connected enough to be quite vulnerable.
Mechanical systems can break and can be tampered with. One key difference today is that the threshold of action is so much lower than in the physical world. Some people have always been willing to shoplift or otherwise steal, but not that many. By going remote over the Internet, orders of magnitude more people are willing to steal music than would even think about physically shoplifting a CD.
I fear that the same will someday apply to cars. Very few people are willing to actually crawl under a car and cut the brake lines. When connected, however, the threshold is very much lower and far more people will be willing to mess with cars digitally than physically.
It's a sad eventuality that we need security solutions for, and now is the time to be designing those solutions, not after car hacking is someone's pastime.
Maybe not a reason to hyperventilate, but you should also worry about the brakes, and not just the throttle. An earlier paper by this same group (2010, www.autosec.org) details successful efforts to breach a car remotely and attain a significant level of control. This includes disabling or applying the brakes, applying the throttle, etc.
It's easy to envision controlling the throttle via cruise control, but how do you disable hydraulic brakes? ABS! In full pulse mode, the ABS system essentially renders the brakes inoperable. Case in point - a failed wheel speed sensor on my truck caused the ABS to engage when I slowed below 5MPH, making it nearly impossible to stop the truck. GM even issued a recall due to this condition.
This wasn't hacking, of course, but it demonstrates how a system designed to increase safety can actually cause a vehicle to become unsafe due to failure or tampering.
As far as applying the brakes, many traction control systems and all yaw control systems allow the computer to do this. Even the ignition key can be overridden by telematics systems such as OnStar, or even remote-start systems. Mechanical steering systems might not be hackable, but that may be the only control you still have.
The LEDs in many (if not all) webcams are under software control and can be disabled. And not all webcams have an LED indicating their status in the first place; I can think of at least four different laptops that don't have one.
Oh, I forgot to add this. One article talks about how a malicious mechanic can input viruses or such through the OBD-II connector. No doubt, attack vectors of that sort may well exist. But why pretend that this is a new phenomenon?
Incompetent mechanics, never mind malicious ones, never mind the amateur backyard mechanic, can far more easily fail to bleed the brake lines properly, so that when the driver least expects it, the brakes won't work. And as easy as it is to NOT bleed brake lines properly, there aren't any safeguards against it.
And it's not necessary to point out that sabotage hardly requires electronic intervention.
FUD consists of telling partial facts for dramatic effect. The vast majority of cars are still designed as I described, but more importantly, those (still) few that do integrate functions, e.g. to coax the driver back into his lane, do so in a way that these automatic safety features can be overridden with driver input. If a brake is applied to "steer" the car, or the wheel is nudged, these actions do NOT eliminate driver input. These actions do NOT take away the driver's ability to turn the wheel or apply the brakes.
Of course, they could be designed stupidly, but on a case by case basis, they aren't. It's a bit like making a big whoop about cruise control. A little late for that, because it's been around way too long to be good FUD fodder any longer. The cruise control won't get away from you, if you either cancel it using the switch or apply the brakes.
There are ways to design such controls safely. That is, the manual override is USUALLY designed as an override, although priority is given to reducing kinetic energy. So yes, a safety feature that Mercedes offers will cause the brakes to be applied when an inattentive driver is about to strike an obstacle. Or, where this applies, local control is designed to override remote control. It is probably true, though, that drivers need to be made aware of these safety features, how they might misbehave, and what actions to take when they do misbehave.
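The override priority described above can be sketched very simply. This is a hedged illustration, not any manufacturer's actual arbitration logic; the numbers and the nudge limit are hypothetical. The two rules it encodes are: automatic features may add braking but never subtract from the driver's demand, and a lane-keeping nudge is clamped so the driver's steering input always dominates.

```python
# Illustrative arbitration between driver input and automatic safety features.

def brake_command(driver_demand, auto_demand):
    """Priority goes to reducing kinetic energy: the larger braking demand
    wins, so automation can add braking but never take it away."""
    return max(driver_demand, auto_demand)

def steering_command(driver_torque, auto_nudge, nudge_limit=0.2):
    """The lane-keeping nudge is clamped to a small authority limit, so the
    driver's torque always dominates and is never eliminated."""
    clamped = max(-nudge_limit, min(nudge_limit, auto_nudge))
    return driver_torque + clamped
```

Under this scheme neither feature can "get away from you": cancelling or counter-steering always carries more authority than the automation.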
Then again, hydraulic brake lines can rupture, mechanical steering gear can seize up, tires can be punctured, and drivers fall asleep at the wheel.
Yet more scare mongering by the media, but I wasn't expecting it to be EE Times.
There is already basic networking in cars, since many sat-nav systems in modern cars incorporate a 2G or 3G GPRS data modem in order to pick up information about congestion and map updates and to provide e-Call functions. However, the sat-nav is designed to be an isolated system within the car and can't influence other systems no matter how much you change its software by hacking. The next step for sat-navs is to provide 4G services in the car, so your passengers can use the Web via a WiFi/WiFi Direct link inside the car. This network would essentially be isolated from critical systems, or some critical systems could send information about the status of the vehicle to this network to be sent over GPRS to your car dealer. Through simple programming of the critical systems it would be extremely easy to stop any attempt by this network to alter the critical systems' firmware (no write access), so the network remains isolated inside the car.
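The isolation described above amounts to a one-way gateway between the critical bus and the infotainment network. Here is a minimal sketch of that idea; the CAN IDs and the frame representation are entirely hypothetical, and a production gateway would of course be far more involved.

```python
# Illustrative one-way gateway (all IDs hypothetical): whitelisted status
# frames may flow from the critical bus to the infotainment network, but
# nothing from the infotainment side (e.g. a firmware-write request) is
# ever forwarded back to the critical systems.

STATUS_IDS = {0x3E8, 0x3E9}   # hypothetical: speed and fuel-level broadcasts

def gateway(frame, direction):
    """frame is a (can_id, payload) pair; return it if allowed, else None."""
    can_id, _payload = frame
    if direction == "critical->infotainment":
        return frame if can_id in STATUS_IDS else None
    return None   # infotainment->critical: dropped unconditionally, no writes
```

With a default-deny rule like this, compromising the passenger-facing network gains an attacker status data at most, not control.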
Car 'critical systems' don't run Windows 7 or 8... or Linux... which a 15-year-old could hack. Car manufacturers have a lot more sense!