Comments
LiketoBike
User Rank
CEO
Engineering/hacking differences
LiketoBike   7/26/2013 2:33:58 PM
NO RATINGS
The economic aspect (time, money) often rears its head here.  Guarding against hacking takes TIME (more testing, more code, more THINKING).  I submit that in industry, the question is often whether an engineer will be allowed to take the time/energy/money to think like a hacker, rather than whether the engineer can think that way.

Thinking like a hacker (for the purposes of increasing robustness and resistance to "bad" hacking) is also a different aspect of the design process - more like testing, as has been mentioned.  Most traditional DESIGN engineering is "make it happen."  "Make it robust" is a different criterion, and a different cognitive state (much like writing and editing are different cognitive states).  

RichQ
User Rank
CEO
Re: Think like a hacker
RichQ   7/26/2013 2:17:43 PM
NO RATINGS
I agree, you need the second type. You want someone who can look at the design and ask "can I do something annoying like honk the horn every time someone turns on the signal blinker," and then diligently dig for some way of doing it. Making the blinker sound the horn is not something most of us would even think of as a problem needing to be solved. Definitely the second kind.
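
For illustration, a bench-level sketch of that kind of probe might look like the following, using Linux SocketCAN on a virtual bus. The CAN IDs and the horn payload are invented for the example; real message IDs are vehicle-specific and would have to be dug out first.

#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <net/if.h>
#include <sys/ioctl.h>
#include <sys/socket.h>
#include <linux/can.h>
#include <linux/can/raw.h>

int main(void)
{
    /* Hypothetical CAN IDs: real ones are vehicle-specific. */
    const canid_t TURN_SIGNAL_ID = 0x123;
    const canid_t HORN_ID        = 0x456;

    int s = socket(PF_CAN, SOCK_RAW, CAN_RAW);
    if (s < 0) { perror("socket"); return 1; }

    struct ifreq ifr;
    strcpy(ifr.ifr_name, "vcan0");   /* virtual CAN interface for bench testing */
    if (ioctl(s, SIOCGIFINDEX, &ifr) < 0) { perror("ioctl"); return 1; }

    struct sockaddr_can addr;
    memset(&addr, 0, sizeof addr);
    addr.can_family  = AF_CAN;
    addr.can_ifindex = ifr.ifr_ifindex;
    if (bind(s, (struct sockaddr *)&addr, sizeof addr) < 0) { perror("bind"); return 1; }

    /* Watch the bus; every time the turn-signal message appears, inject a horn message. */
    struct can_frame frame, horn;
    while (read(s, &frame, sizeof frame) > 0) {
        if (frame.can_id == TURN_SIGNAL_ID) {
            memset(&horn, 0, sizeof horn);
            horn.can_id  = HORN_ID;
            horn.can_dlc = 1;
            horn.data[0] = 0x01;     /* "horn on" payload, invented for the example */
            write(s, &horn, sizeof horn);
        }
    }
    close(s);
    return 0;
}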

DMcCunney
User Rank
CEO
Re: Hackers and test engineers
DMcCunney   7/26/2013 2:13:12 PM
NO RATINGS
You're welcome.

Note that hacking will still occur, because code will never be perfect.  I'm following a couple of developments now where a requirement is that if you submit code, you submit tests that can verify the code as well.  But that's not yet the norm, and even when it is, there will be areas of vulnerability.  What happens when the vulnerable spot is interactions between pieces of code?  Testing all of the possible interactions between sections of code may be an order of magnitude harder than verifying the robustness of any individual module.

All you can realistically do is raise the bar and make it harder to exploit your code, by testing as thoroughly as you can before putting it into production.
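
As a rough sketch of that submit-tests-with-your-code idea, a bounds-checked copy routine and the self-test that might accompany it could look like this in C (the function name and limits are invented for the example):

#include <assert.h>
#include <string.h>

/* Hypothetical routine: copies at most dst_size-1 bytes and always
 * NUL-terminates. Returns 0 on success, -1 if the source did not fit. */
static int safe_copy(char *dst, size_t dst_size, const char *src)
{
    size_t len = strlen(src);

    if (dst_size == 0)
        return -1;
    if (len >= dst_size) {
        memcpy(dst, src, dst_size - 1);
        dst[dst_size - 1] = '\0';
        return -1;                      /* input truncated: caller must handle it */
    }
    memcpy(dst, src, len + 1);
    return 0;
}

/* The test submitted alongside the code: exercise the normal case,
 * the boundary, and the hostile (oversized) case. */
int main(void)
{
    char buf[8];

    assert(safe_copy(buf, sizeof buf, "hello") == 0);        /* fits */
    assert(strcmp(buf, "hello") == 0);

    assert(safe_copy(buf, sizeof buf, "1234567") == 0);      /* exactly fills the buffer */

    assert(safe_copy(buf, sizeof buf, "deliberately-too-long") == -1);  /* must be rejected */
    assert(buf[sizeof buf - 1] == '\0');                     /* and still terminated */

    return 0;
}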

junko.yoshida
User Rank
Blogger
Re: Hackers and test engineers
junko.yoshida   7/26/2013 2:04:32 PM
NO RATINGS
"Most of the publicized exploits I can think of offhand are in legacy code first written before hacking became common.  It simply didn't occur to the developers that someone might deliberately try to overflow a buffer with bad intent."

This is very informative. Now I get it. Thank you.

junko.yoshida
User Rank
Blogger
Re: Think like a hacker
junko.yoshida   7/26/2013 2:00:31 PM
NO RATINGS
RichQ, thanks for your detailed description here. 

If you need to design a new automotive MCU that is supposed to address potential hacking issues (i.e., your next car might be attacked by malicious hackers and end up being externally controlled somehow), which type of hacker do you need to deal with the issue?

Rather than thinking "that would never happen" (as many engineers do), we need someone who can think completely outside the box and say, "Let me hack it," right?

So, by your definition, does that guy belong to the first or second type?

I would say the second type. 

RichQ
User Rank
CEO
Think like a hacker
RichQ   7/26/2013 1:26:15 PM
NO RATINGS
I've heard a similar comment about needing to think like a hacker to be able to prevent hacking, during a presentation at the Black Hat conference. I think it reflects the feeling that there is a different mindset (attitude, belief, morality, whatever) involved. On the one hand, you're looking at something to figure out how to make it work correctly as intended. On the other hand you're looking at something to figure out how to make it do something unintended.

But I think we need to be careful in generalizing the term "hacker" too far. I see two different kinds of folks who fall under that term. One is the person who creates a kind of "quick and dirty" solution to a problem and the other is someone who tries to break into a system for malicious purposes.

In order to answer your question, then, you need to be sure of which type of hacker you are talking about. I think that hackers of the first type (problem solvers) can easily act as engineers if they learn and use formal methods, and engineers become hackers by bypassing formality. Some folks may be so ingrained in using formal methods that they need to learn how to let go, but I think any engineer can be a hacker in this sense. One or the other mode might feel more natural to someone, though, so to some extent moving between hacker and engineer is a bit like speaking two languages - your native one and one you learn later in life. The degree of fluency someone has in this second language (or engineering mode) will vary from person to person.

The second type of hacking, though, involves a gap that is a lot harder to bridge. This kind of hacking is driven by ego, greed, and malice, and getting into the mindset of seeking to destroy, pervert, or circumvent a design for gain or pride (so that you can predict avenues of attack and block them) is much harder when your natural inclination is to build, refine, and perfect. This kind of hacking can also be learned, no doubt, but it requires a much greater mental shift.

So, which group were you asking about in the blog?

DMcCunney
User Rank
CEO
Re: Hackers and test engineers
DMcCunney   7/26/2013 12:58:28 PM
NO RATINGS
"Do you think, then, it is customary within an engineering organization to leverage their own test engineers in finding any weakness of a system that can be hacked?"

I'm not sure it's deliberately done for that purpose, but crafting code harder to hack will be a side-effect of the test process.

The most common hacker exploit is a buffer overflow.  Code manipulates data.  It expects to get a certain amount of data, in a certain format, and allocates a chunk of memory to hold the data it's manipulating.  What happens if it gets more data than expected?  What happens to the excess that won't fit in the buffer?  In a buffer overflow exploit, that extra data overflows the buffer and overwrites some other part of memory.  The result may allow the hacker to compromise the system.

Checking for things like buffer overflows in code should be part of the test process. The general rules are "never trust your data", and "never assume that your code can handle every possible condition that might arise when it runs."
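
To make the overflow pattern concrete, a schematic example might look like this; it is illustrative only, not drawn from any of the exploited code discussed here.

#include <stdio.h>
#include <string.h>

#define BUF_SIZE 16

/* Unsafe: strcpy() trusts the caller to stay within the 16-byte buffer.
 * A longer input overwrites adjacent memory: the buffer overflow. */
void read_name_unsafe(const char *input)
{
    char name[BUF_SIZE];
    strcpy(name, input);               /* no length check */
    printf("hello, %s\n", name);
}

/* Checked: never copy more than the buffer can hold, and terminate it. */
void read_name_checked(const char *input)
{
    char name[BUF_SIZE];
    strncpy(name, input, BUF_SIZE - 1);
    name[BUF_SIZE - 1] = '\0';
    printf("hello, %s\n", name);
}

int main(void)
{
    /* The checked version truncates hostile input; the unsafe version
     * would corrupt the stack with the same string. */
    read_name_checked("a string much longer than sixteen bytes");
    return 0;
}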

Most of the publicized exploits I can think of offhand are in legacy code first written before hacking became common.  It simply didn't occur to the developers that someone might deliberately try to overflow a buffer with bad intent.  In normal operation a buffer overflow wouldn't happen, so there was no provision to guard against it.  Most of the Windows Critical Patches addressed precisely those oversights.

junko.yoshida
User Rank
Blogger
Re: Motive
junko.yoshida   7/26/2013 12:56:09 PM
NO RATINGS
That's a good point. 

But there are hackers now actually being hired and paid well by governments -- both here in the U.S. and abroad. Their mission is to find computer flaws:

http://www.nytimes.com/2013/07/14/world/europe/nations-buying-as-hackers-sell-computer-flaws.html?pagewanted=all

I am curious if automotive companies are also actually hiring hackers to do the same. 

mohov0
User Rank
Rookie
Motive
mohov0   7/26/2013 12:50:10 PM
NO RATINGS
It's all about motive. A hacker hacks, meaning he breaks into the code to get inside and perform malicious acts. A test engineer's motivation is to improve the code once the tests show that it is not foolproof. The only way to get into the mindset of a hacker is to think maliciously.

Testing new code added to an already-tested piece of code is much harder. Hence, in systems that require close to 100 percent reliability, such as spacecraft, redundancy is built in. In security for commercial systems, redundancy should also be built in. It is more expensive, but it might be worth the cost to ensure reliability is met and to make it harder for hackers to plunder and steal.

The current hacker story going around should be analyzed for its flaws; see: http://www.bloomberg.com/news/2013-07-25/5-hackers-charged-in-largest-data-breach-scheme-in-u-s-.html
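
As a rough sketch of what that redundancy could look like in code (function names invented for the example), the same size limit is enforced at two independent layers, so one missed check does not by itself open a hole:

#include <stdio.h>
#include <string.h>

#define MAX_MSG 32

/* Layer 1: the protocol handler rejects oversized messages outright. */
static int accept_message(const char *msg)
{
    if (strlen(msg) >= MAX_MSG)
        return -1;                    /* first line of defense */
    return 0;
}

/* Layer 2: the storage layer re-checks the bound before copying. */
static int store_message(char out[MAX_MSG], const char *msg)
{
    if (strlen(msg) >= MAX_MSG)       /* redundant check, deliberately */
        return -1;
    memcpy(out, msg, strlen(msg) + 1);
    return 0;
}

int main(void)
{
    char slot[MAX_MSG];
    const char *msg = "short and well-formed";

    if (accept_message(msg) == 0 && store_message(slot, msg) == 0)
        printf("stored: %s\n", slot);
    return 0;
}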
