Thanks, Junko. I love this discussion. In professional engineering, a hack doesn't mean high quality, so maybe it's a pride thing. BTW, to comment on this article, make sure you log in separately, because for some reason this article hangs on the login screen. Hmmm, unless you have a hack to get around that problem, you'll have to use the workaround for now.
First, we need to define which "hacker" you're referring to. Initially it seems like you're talking about someone with malicious intent. Frankly, I see no correlation between engineers and hackers in that sense. The equipment may be similar, but it is the malice that is the differentiating factor.
Toward the middle of the article you're discussing "hackers" in the sense of people who are building things but aren't necessarily fully educated or involved in their career path. This is the one I'd like to focus on. I think many engineers go home at the end of the day and hack. They find a problem and they solve it, and since they're not doing it for work, they aren't following SOP. They're exploring and chasing passions and obsessions, often only for the purpose of doing it. I think engineers can be hackers, and many uneducated hackers can go on to be engineers.
One interesting thing I've seen is when a hacker is obsessed with a specific technology. They learn everything they can about it, sometimes surpassing professionals in their knowledge. However, they have no interest whatsoever in broadening into all the other subjects that one would have to master to become a professional.
@caleb, there are many ways to define hackers, and clearly, many people have defined it differently.
But my original intention was, as you can see in the first few graphs of my story:
....Soja and I were discussing issues concerning cars. I was asking him how the best automotive chip suppliers like Freescale can get a few steps ahead of hackers to identify potential security holes.
Soja quipped: "To protect against attacks, you need to think like attackers."
My conversation with this Freescale executive really opened my eyes. For example, established automotive chip suppliers -- usually full of smart engineers -- do need help from hackers, so that they can think ahead, figuring out which security holes they need to patch when designing their next-generation automotive MCU, for example.
In that context, I would like to know whether design engineers at chip companies can morph themselves into hackers to help that cause, or whether they are really two different types of people, in which case companies probably need to hire external "hackers" to do the job...
My one comment is that in the course of my studies I had to take a class entitled "Engineering Ethics." I do not think hackers have taken this class. Having said that -- yes, you need to know your enemies better than you know your friends. In some sense, ethics is a cultural issue and can be interpreted based on the culture you come from. Not making any judgments here -- just an observation.
Among engineers who can best think like hackers are those who work on testing, he added.
No surprise. A friend who is a test engineer said the fundamental distinction between a developer and a test engineer is that a developer assumes the code will work, while a test engineer assumes the code will fail. Indeed, getting it to fail is what a test engineer does.
"Hackers" in the pejorative sense are behaving like test engineers. The test engineer wants to break the code and discover how and why it broke, so the code can be changed to make that failure impossible. The hacker is looking for failure points where code can be exploited for malicious purposes. The process is similar. The intended end result is very different.
Well put, DMcCunney. So you are saying that in their pursuit of "failure points" in a system, test engineers and hackers are working with a similar mindset. It makes sense. Do you think, then, it is customary within an engineering organization to leverage their own test engineers in finding any weakness of a system that can be hacked?
Do you think, then, it is customary within an engineering organization to leverage their own test engineers in finding any weakness of a system that can be hacked?
I'm not sure it's deliberately done for that purpose, but crafting code that is harder to hack will be a side effect of the test process.
The most common hacker exploit is a buffer overflow. Code manipulates data. It expects to get a certain amount of data, in a certain format, and allocates a chunk of memory to hold the data it's manipulating. What happens if it gets more data than expected? What happens to the excess that won't fit in the buffer? In a buffer overflow exploit, that extra data overflows the buffer and overwrites some other part of memory. The result may allow the hacker to compromise the system.
Checking for things like buffer overflows in code should be part of the test process. The general rules are "never trust your data", and "never assume that your code can handle every possible condition that might arise when it runs."
Most of the publicized exploits I can think of offhand are in legacy code first written before hacking became common. It simply didn't occur to the developers that someone might deliberately try to overflow a buffer with bad intent. In normal operation a buffer overflow wouldn't happen, so there was no provision to guard against it. Most of the Windows Critical Patches addressed precisely those oversights.
Most of the publicized exploits I can think of offhand are in legacy code first written before hacking became common. It simply didn't occur to the developers that someone might deliberately try to overflow a buffer with bad intent.
This is very informative. Now I get it. Thank you.
Note that hacking will still occur, because code will never be perfect. I'm following a couple of developments now where a requirement is that if you submit code, you submit tests that can verify the code as well. But that's not yet the norm, and even when it is, there will be areas of vulnerability. What happens when the vulnerable spot is interactions between pieces of code? Testing all of the possible interactions between sections of code may be an order of magnitude harder than verifying the robustness of any individual module.
All you can realistically do is raise the bar and make it harder to exploit your code, by testing as thoroughly as you can before putting it into production.
It's all about motive. A hacker hacks, that is, breaks the code, to get inside and perform malicious acts. A test engineer's motivation is to improve the code once the tests show that it is not foolproof. The only way to get into the mindset of a hacker is to think maliciously. Testing the new code of an already-tested piece of code is much harder. Hence, in systems that require close to 100 percent reliability, such as spacecraft, redundancy is built in. In security for commercial systems, redundancy should also be built in. It is more expensive, but might be worth the cost to ensure reliability is met and to make it harder for hackers to plunder and steal. The current hacker story going around should be analyzed for its flaws. See: http://www.bloomberg.com/news/2013-07-25/5-hackers-charged-in-largest-data-breach-scheme-in-u-s-.html
I've heard a similar comment about needing to think like a hacker to be able to prevent hacking, during a presentation at the Black Hat conference. I think it reflects the feeling that there is a different mindset (attitude, belief, morality, whatever) involved. On the one hand, you're looking at something to figure out how to make it work correctly as intended. On the other hand you're looking at something to figure out how to make it do something unintended.
But I think we need to be careful in generalizing the term "hacker" too far. I see two different kinds of folks who fall under that term. One is the person who creates a kind of "quick and dirty" solution to a problem and the other is someone who tries to break into a system for malicious purposes.
In order to answer your question, then, you need to be sure of which type of hacker you are talking about. I think that hackers of the first type (problem solvers) can easily act as engineers if they learn and use formal methods, and engineers become hackers by bypassing formality. Some folks may be so ingrained in using formal methods that they need to learn how to let go, but I think any engineer can be a hacker in this sense. One or the other mode might feel more natural to someone, though, so to some extent moving between hacker and engineer is a bit like speaking two languages - your native one and one you learn later in life. The degree of fluency someone has in this second language (or engineering mode) will vary from person to person.
The second type of hacking, though, involves a gap that is a lot harder to bridge. This kind of hacking is filled with ego, greed, and malice, and getting into a mindset of seeking to destroy, pervert, or circumvent a design for gain or pride (so that you can predict avenues of attack and block them) is a lot harder for folks whose natural inclination is to build, refine, and perfect. This kind of hacking can also be learned, no doubt, but it requires a much greater mental shift.
So, which group were you asking about in the blog?
If you need to design a new automotive MCU that is supposed to address potential hacking issues (i.e., your next car might be attacked by malicious hackers, and thus might somehow be controlled externally), which type of hacker do you need to deal with the issue?
Rather than thinking "that would never happen" (which many engineers said), we need someone who can totally think out of box, and say, "let me hack it," right?
So, in your definition, does that guy belong to the first or the second type?
I agree, you need the second type. You want someone who can look at the design and ask "can I do something annoying like honk the horn every time someone turns on the signal blinker," and then diligently dig for some way of doing it. Making the blinker sound the horn is not something most of us would even think of as a problem needing to be solved. Definitely the second kind.
Here's where something like gamification might come in handy. As RichQ and others have noted, it's a mindset thing. In this case what's needed may be more of a gamer mindset than that of an engineer or designer.
The economic aspect (time, money) often raises its head here. Guarding against hacking takes TIME (more testing, more code, more THINKING). I submit that in industry, the question is often whether an engineer will be allowed to take the time/energy/money to think like a hacker, rather than whether the engineer can think that way.
Thinking like a hacker (for the purposes of increasing robustness and resistance to "bad" hacking) is also a different aspect of the design process - more like testing, as has been mentioned. Most traditional DESIGN engineering is "make it happen." "Make it robust" is a different criterion, and a different cognitive state (much like writing and editing are different cognitive states).
@LiketoBike, your points are well taken. It's not just a different mindset; the right question to ask is: Can companies afford to have engineers think like hackers (by spending the time and money)?
But as systems' complexity increases and everything gets connected to the Internet, I am guessing it will eventually come down to this: Can any company afford not to have someone who can think like hackers in the future?
junko, you ask: Can any company afford not to have someone who can think like hackers in the future?
And the answer (of course) is: depends. If you're lucky (or your system is boring) you may never be hacked.
Ultimately, management needs to balance the very real and immediate additional engineering cost of making something more hacker-resistant against more nebulous and future benefits such as reduced liability exposure, reputation fortification, avoidance of hacker-induced system failures, and the like.
Unfortunately, the inability to even entertain the thought that someone might want to hack their system tends to create a bias in favor of avoiding expense by doing nothing - a condition that seems to require bitter experience before it can change.
I think we're getting closer to the center of it. An engineer implements what's in the specification: "read some characters from the keyboard," or "convert the sensor voltage to degrees C." That becomes our goal, and we implement it in a way that is correct and reliable. What the spec does not say, and what engineers (such as myself) fail to consider, is: "ensure an excess of characters typed cannot be written to the stack where they could form the address of a supervisory process," or perhaps "an excessive reading (say, in degrees F) must not result in rocket engine shutdown during flight." Many of us have enough trouble seeing weaknesses in design that are vulnerable to accident, never mind a determined attack. So indeed, we are not trained to think outside the box; in fact, "outside the box" starts to sound clichéd, so it might be understating the problem.
Meanwhile, a hacker's very goal (and of course I mean a hacker of the malevolent type) is to achieve the unauthorized and the unanticipated. It's a goal that isn't even opposed to ours - it's on the Z axis when we're watching X and Y.
So then of course we are told we must consider security, and we make the doors and windows tight and bulletproof, but the hackers just come down the chimney. Anyone remember a scene in one of the _Hitchhiker's_ books (the last one, I think) where one of the characters (Ford, probably) gains access to a secure building simply by opening a window? He found himself suspended (I forget how or why) outside one of the higher floors, and reasoned that what the building designers did not expect was for him to be there at all...
Can we engineers be taught to think that way? Probably, but I think we can agree it's not about designing an encryption key with more bits. In fact I wonder if there's any such thing as a course or book anywhere that teaches this kind of lesson!
"A hack" has traditionally meant a "quick and dirty fix." I don't know of any engineer who hasn't had to create his own share of hacks, from time to time. Hopefully, these are temporary fixes that get done right in short order. So a hacker in the more traditional sense is simply someone who tries things "quick and dirty."
Your use of "hacker" is the new one, to describe that annoying vermin that tries to break things (software, in this case). Not much more than a common vandal, but one who works on software.
Yes, it's hopefully difficult to turn engineers into scum. Hopefully. But if defending against software vandalism is posed as one of the design goals, then it should become just another aspect of the design effort. And I'm not surprised that test engineers are the ones called upon to try to break the code, as they try to break any other aspect of a product.
This is a confusing discussion, due to the two vastly different definitions of "hacker" being used. If we apply the original definition, along the lines of "quick and dirty," then all engineers are also hackers if they have ever implemented something in software or hardware -- usually when pressed for time -- without following a rigorous, disciplined engineering approach. A software hack might be a few lines of code that you added or modified in an ad-hoc manner (maybe trial and error is a better description), which seems to fix a problem you're seeing, but it's not thoroughly tested, it was not the result of rigorous analysis, and it's definitely not production-worthy -- it's just a "hack" to get you past your immediate problem. Similarly, a hardware hack might be that 2.2 pF capacitor you added to a PCB trace to slow down a signal and fix a timing problem. You didn't do any calculations or analysis; you just figured you'd try out a few different caps until you found one that did the trick, and that happened to be 2.2 pF.
But the discussion about the security of automotive systems was about the other definition of "hacker" -- the one that should rightly be called "attacker." Yes, technical skills are required, and a person with those skills might or might not be a degreed engineer, but that is hardly the point. The distinguishing characteristics of the attacker are (a) technical skill and (b) malicious intent. Engineers and non-engineers alike can possess either or both of those traits.
For a long time, there have been engineering disciplines that required thinking like an attacker. Consider the fields of cryptography, signal jamming and anti-jamming systems, and the general subject of electronic countermeasures. The ability to think like an enemy when applying engineering creativity to solve these kinds of problems is not some special state of mind or moral code that "hackers" (attackers) possess and to which "real engineers" are somehow immune.
For a long time, there have been engineering disciplines that required thinking like an attacker. Consider the fields of cryptography, signal jamming and anti-jamming systems, and the general subject of electronic countermeasures.
Exactly. As the whole world of embedded systems becomes more vulnerable to potential external attacks (as more and more systems are designed to connect to the Internet), isn't it about time to have a special course at engineering schools -- focused on "how to think like an attacker"?
If we can teach engineering, we should also be able to teach the art of hacking...
Engineers are interested in how things work, which is not so different from the hacker mentality. If an engineer can reverse-engineer something and figure out how it works, theoretically he/she should be able to gain a good understanding of the vulnerabilities that could be exploited.
It's pretty normal in human language for one word to have shades of meaning, as hacker/hack does. No big deal. But we do have another option: "cracker" is (or was) the term for the malicious hacker. When I first started working at a programming magazine, my editor told me the difference between the two words, and that programmers were very touchy on the subject. Hackers had a sense of pride and were more like self-appointed test engineers. They were playing offense, but to help with your defense. It's a mission of "tough love" for them. It does (sort of) help to have another word to differentiate the two. Here is hacker vs. cracker as defined by Chad Perrin in IT Security (2009):
I believe it's still useful to differentiate between hackers and security crackers, though, and that terms like "malicious security cracker" are sufficiently evocative and clear that their use actually helps make communication more effective than the common journalistic misuse of 'hacker'.
Can engineers think like malicious security crackers? Sure they can---the security crack is just like any problem, and engineers solve problems. Putting themselves in that mindset, however, may take a little doing if the system they are cracking is their own. Just like the earlier comment about writing vs. editing: if you write something, you need some time before you can see the flaws in your own work. If you step away from something and come back to it later, you're more likely to see the problems in the system.
Good questions, Junko, and I think Susan is right that there are many shades of meaning to the word (like the difference between analysis and blog in journalism). But most of the hackers I know studied electrical engineering in school, work as engineers, and call themselves hackers. If we're talking about cyber criminals or other malicious people, let's call them what they are. And if a self-educated hacker gains enough knowledge to do some interesting things in code, is that not -- at some level -- engineering?
Do you have to have a degree in engineering to be an "engineer?" And must you lack such a degree to be a "hacker?" What do others think about that...?
There are so many different definitions of "hacker" that IMO it's important to indicate which one you're using. There are seven definitions at Wiktionary, four of them technological. Wikipedia has three definitions, which I'll repeat here:
The first definition includes both "white hat" hackers like Robert Redford in Sneakers (1992) and "black hat" hackers, AKA "crackers." There's also the pejorative "hacker," meaning someone who does things haphazardly, like a "hack writer," or a programmer who hacks away at code until it works with a few test cases instead of doing a careful design.
So call me a "hacker" if you like, but please don't ever call me a "hacker" :-)
This is one of the most interesting articles I have come across in EE Times recently. Thanks for posting it, Junko.
I think engineers and hackers are the same species with the same mindset, just different vision. I have seen engineers act like hackers. In one startup, when engineers were competing for resources and the sysadmin was weak, they started to hack the system to get their jobs ahead in the competition. I see tasks like a glass half full: engineers focus on the half-full part and hackers focus on the half-empty part. Engineers try to get more water to fill it; hackers focus only on the empty side of the glass. I think they can be interchanged. The only thing is that engineers tend to worry about the legal vs. illegal aspect, where hackers tend to ignore that part; in fact, they love being in the illegal zone. If we made everything legal, then every engineer could act like a hacker, given that both are asked to do the same task.
IMHO there is little that differentiates hackers, crackers, and engineers: the supposition of maliciousness does not necessarily apply to only one of the categories. (There's also the term "malicious engineer.")
On the other hand, one thing is common to all of them: to be superior, you have to apply analysis, a systematic approach, and sometimes patience. Aside from the "brute force attack," all these methods describe engineers as well as "hackers."
To be honest, from time to time I HAVE to hack something: to access information necessary to me but denied by lack of documentation, cooperation, incompetence, or sheer ignorance.
Anyone remember Larry Lange? He was the first EE Times reporter to focus on covering Internet stuff. He interviewed Marc Andreessen in our offices before anyone understood what Netscape was doing.
Larry used to love the Black Hat event. He pointed out there are white and black hat hackers working for good and evil purposes.
Since then it has become clear to me there are grey areas too.
Add this to the mix: Hackathons are events where engineers design stuff. In this context hacking means quickly creating prototypes to test ideas out, bypassing what can be a slow design process at large established companies.
Hmmm. We could consider the Manhattan Project was a big hackathon and, depending on your point of view, the scientists were either malicious hackers or the kind in the white hats. (Of course, that kind of hackathon is not welcome at DESIGN West.)
First, my pet peeve; re: "a hacker is someone who breaks into computer systems stealing information and doing other malicious stuff." That defines a criminal. A hacker may or may not be a criminal, as a doctor, lawyer or anyone may or may not be a criminal, but that definition is of a criminal.
Now to the questions at hand: "can engineers become hackers and hackers become engineers?"
I believe they can. In my mind, the mindset, thought processes, and level of creativity are similar. Engineering, however, is supposed to follow a disciplined and calculated process, while hacking tends to take a more "I'm pretty sure" approach.
Most engineers I've met can skip the process and work on intuition if need be. I suspect it's a little more difficult for a hacker to move to engineering, simply because it's almost always more difficult to transition from low discipline to high discipline.
In other words, I think the hacker mindset is already in most engineers. It just needs to be let loose. Corporate structure is more likely to be an inhibitor than is the engineering mindset.
Duane: I also agree with your insight that the hacker mindset suffers inside large corporations. One of the things that amazed me about Steve Jobs' leadership at Apple was that he kept that alive as the company grew. I can't think of another innovator (or hacker in this context) who achieved that at another company of comparable size.
This begs the question: Who do you think is the greatest living hacker/innovator/inventor who is in a leadership role within any major company, worldwide?
Junko: "Does anyone here work in a corporate environment in which you are encouraged to let your hair down and think like 'attackers' in your engineering projects?"
Honestly, too much is being made of this. Too narrow a definition, too much unsubstantiated differentiation of categories of people. Like Duane said, the definition of "hacker" used in this article is that of a criminal, not the experimenter or the quick-fixer, as it was previously meant. Engineers can also be criminals, if it comes to that.
Part of good engineering design has always been to make the product as fool-proof, idiot-proof, temperature-stable, voltage variation tolerant, and any other kind of "proof," to make the product as robust as possible, operating in its intended environment, within cost constraints. Defense against criminal attacks has to be included along with all the other defense mechanisms. And of course new pathways for criminals, never mind just plain old bunglers, become possible, the more interconnected a product is.
Engineers have always been taught these things, even if the narrow focus on hacking into a system via its network connections is a relatively new twist. Look at all the security updates you get with Windows OSs. It's an ongoing problem. The more a product is designed for convenience, the more pathways are created that can be abused, the more new measures have to be devised to protect the system from intentional OR unintentional intrusion.
For example, remotely installed software updates in a digital control system are convenient, but create pathways for abuse. EXACTLY THE SAME WAY that the OBD-II system is convenient, and creates pathways for abuse.
Let's not put too much of a fine point on "what an engineer is" and "what an engineer is not."
no, my point wasn't really about defining what an engineer is and what he is not.
I didn't mean to pigeon hole any of the engineers.
But I was simply responding to the original off-hand comments made by a Freescale executive about automotive security. How are engineers working at those companies (and I am talking about those who have not been necessarily hired as security experts) responding to the rising needs of "thinking like attackers"?
Because engineers and attackers are no different in terms of their ability to think analytically, are they having no problems in playing interchangeable roles?
Or, are some chip companies beginning to hire security experts to find security holes in a system to which they supply their chips?
"Because engineers and attackers are no different in terms of their ability to think analytically, are they having no problems in playing interchangeable roles?"
The way I would put it is simply that network security is a discipline that becomes increasingly important as more things are interconnected. But there's nothing new or different in this. Engineering has always had to deal with innovation. That's what it's all about. When I went to school, Ethernet was just being born and Internet Protocols did not exist yet. Now packet-switched networks and internetworking are a major discipline.
Cybersecurity is a relatively new field just like digital electronics and solid state electronics were new a few decades ago. With cybersecurity, the problem is not that engineers can't think that way. The problem is that it's a constant battle. Then again, what's new about that? Isn't this always the case? E.g., with faster and faster chips, aren't we similarly having to solve and re-solve problems of heat, of pulse rise times, of latency in interconnects? With cybersecurity, you're similarly having to re-solve problems, as new vulnerabilities emerge.
Well said Bert. I don't really understand all the fuss, regardless of which definition of "hacker" is meant. Engineers sometimes fly by the seat of their pants and "hack" quick and dirty solutions to problems, and certainly some engineers are criminals -- or could be if they wanted to be. Ironically, it wasn't that long ago that EE Times had an article about infamous engineer-criminals -- but those guys were violent types, not malicious intruders of networks.
I also like your point about how engineers have always had requirements to make their designs foolproof, temperature-proof, etc. Actually, "proof" is too strong a word -- "resistant" is more accurate. In any case, if your next design happens to include network connectivity, you simply add hacker-resistant to that list.
I totally agree with Bert and Frank on this. The term "hacker" was hijacked by the media some time ago and redefined as someone with malicious intent. But as far as I'm concerned, it's just a slang term for "hacking" code, in the same way that "hack" is sometimes used (mostly as an insult) to describe journalists or marketers who'd do anything for a buck.
In fact, there's a large national meetup group called "hacks and hackers" that includes engineers and journalists who are looking to apply innovative technology to advance journalism. In that context, neither side draws offense at the term, and the group's intent is absolutely positive.
When they are good, an engineer and a hacker are exactly the same thing. If differences exist between what a person does and either definition, then either the hacker has not yet reached his potential, or the engineer is having his potential limited by management.
The problem with the specific example given is not that a fundamental difference exists between hackers and engineers; but that anyone that has spent a great deal of time and effort perfecting something will be blind to certain faults. If you just spent a month making a system as secure as you are able, and then you are given a day to "think like a hacker" and try to find ways to circumvent your own security, you will fail to breach your own system because you have already fixed all the exploits you can think of. The solution is to get a fresh set of eyes performing security tests, someone without a vested interest in the success of the device.
Hacking (by my definition) is a lot about understanding how things work and how to make things operate in new, different, or more efficient ways. Part of that understanding can come through taking things apart.
That leads to a question: Of all the engineers that you know, how many took things apart when they were kids? Most that I know took apart radios, clocks, televisions, anything with a motor, etc.
If that doesn't show the hacker mindset, I don't know what does.