It's pretty normal in human language for one word to have shades of meaning, as hacker/hack does. No big deal. But we do have another option: cracker is (or was) the term for the malicious hacker. When I first started working at a programming magazine, my editor explained the difference between the two words and warned me that programmers were very touchy on the subject. Hackers had a sense of pride and were more like self-appointed test engineers: they played offense in order to help your defense. It's a mission of "tough love" for them. It does (sort of) help to have another word to differentiate the two. Here's hacker vs. cracker as defined by Chad Perrin in IT Security (2009):
I believe it's still useful to differentiate between hackers and security crackers, though, and that terms like "malicious security cracker" are sufficiently evocative and clear that their use actually helps make communication more effective than the common journalistic misuse of 'hacker'.
Can engineers think like malicious security crackers? Sure they can: cracking a security system is a problem like any other, and engineers solve problems. Putting themselves in that mindset, however, may take a little doing if the system they are cracking is their own. Just like the earlier comment about writing vs. editing: if you write something, you need some time before you can see the flaws in your own work. If you step away from something and come back to it later, you're more likely to see the problems in the system.
Junko, you ask: Can any company afford not to have someone who can think like hackers in the future?
And the answer (of course) is: it depends. If you're lucky (or your system is boring) you may never be hacked.
Ultimately, management needs to balance the very real and immediate additional engineering cost of making something more hacker-resistant against more nebulous future benefits such as reduced liability exposure, reputation fortification, avoidance of hacker-induced system failures, and the like.
Unfortunately, the inability to even entertain the thought that someone might want to hack their system tends to create a bias in favor of avoiding expense by doing nothing - a condition that seems to require bitter experience before it can change.
This is a confusing discussion, due to the two vastly different definitions of "hacker" being used. If we apply the original definition, along the lines of "quick and dirty," then all engineers are also hackers if they have ever implemented something in software or hardware -- usually when pressed for time -- without following a rigorous, disciplined engineering approach. A software hack might be a few lines of code that you added or modified in an ad-hoc manner (maybe trial & error is a better description), which seems to fix a problem you're seeing, but it's not thoroughly tested, it was not the result of rigorous analysis, and it's definitely not production-worthy -- it's just a "hack" to get you past your immediate problem. Similarly, a hardware hack might be that 2.2 pF capacitor you added to a PCB trace to slow down a signal and fix a timing problem. You didn't do any calculations or analysis; you just figured you'd try out a few different caps until you found one that did the trick, and that happened to be 2.2 pF.
But the discussion about the security of automotive systems was about the other definition of "hacker" -- the one that should rightly be called "attacker." Yes, technical skills are required, and a person with those skills might or might not be a degreed engineer, but that is hardly the point. The distinguishing characteristics of the attacker are (a) technical skill and (b) malicious intent. Engineers and non-engineers alike can possess either or both of those traits.
For a long time, there have been engineering disciplines that required thinking like an attacker. Consider the fields of cryptography, signal jamming and anti-jamming systems, and the general subject of electronic countermeasures. The ability to think like an enemy when applying engineering creativity to solve these kinds of problems is not some special state of mind or moral code that "hackers" (attackers) possess and to which "real engineers" are somehow immune.
Engineers are interested in how things work, which is not so different from the hacker mentality. If an engineer can reverse-engineer something and figure out how it works, theoretically he/she should be able to gain a good understanding of the vulnerabilities that could be exploited.
"A hack" has traditionally meant a "quick and dirty fix." I don't know of any engineer who hasn't had to create his own share of hacks, from time to time. Hopefully, these are temporary fixes that get done right in short order. So a hacker in the more traditional sense is simply someone who tries things "quick and dirty."
Your use of "hacker" is the new one, to describe that annoying vermin that tries to break things (software, in this case). Not much more than a common vandal, but one who works on software.
Yes, it's hopefully difficult to turn engineers into scum. Hopefully. But if defending against software vandalism is posed as one of the design goals, then it should become just another aspect of the design effort. And I'm not surprised that test engineers are the ones called upon to try to break the code, as they try to break any other aspect of a product.
I think we're getting closer to the center of it. An engineer implements what's in the specification: "read some characters from the keyboard," or "convert the sensor voltage to degrees C." That becomes our goal and we implement it in a way that is correct and reliable. What the spec does not say, and what engineers (such as myself) fail to consider, is: "ensure an excess of characters typed cannot be written to the stack where they could form the address of a supervisory process," or perhaps "an excessive reading (say, in degrees F) must not result in rocket engine shutdown during flight." Many of us have enough trouble seeing weaknesses in design that are vulnerable to accident, never mind a determined attack. So indeed, we are not trained to think outside the box; in fact "outside the box" starts to sound clichéd, so it might be understating the problem.
Meanwhile, a hacker's very goal (and of course I mean a hacker of the malevolent type) is to achieve the unauthorized and the unanticipated. It's a goal that isn't even opposed to ours - it's on the Z axis when we're watching X and Y.
So then of course we are told we must consider security, and we make the doors and windows tight and bulletproof, but the hackers just come down the chimney. Anyone remember a scene in one of the _Hitchhiker's_ books (the last one, I think) where one of the characters (Ford, probably) gains access to a secure building simply by opening a window? He found himself suspended (I forget how or why) outside one of the higher floors, and reasoned that what the building designers did not expect was for him to be there at all...
Can we engineers be taught to think that way? Probably, but I think we can agree it's not about designing an encryption key with more bits. In fact I wonder if there's any such thing as a course or book anywhere that teaches this kind of lesson!
Here's where something like gamification might come in handy. As RichQ and others have noted, it's a mindset thing. In this case what's needed may be more of a gamer mindset than that of an engineer or designer.
@LiketoBike, your points are well taken. It's not just a different mindset; the right question to ask is: Can companies afford to have engineers think like hackers? (by spending time and money)
But as systems' complexity increases and everything gets connected to the Internet, I am guessing it will eventually come down to this: Can any company afford not to have someone who can think like hackers in the future?