
Prevention, quality and other innovations in hardware debug

7/2/2012 04:08 PM EDT
4 comments
re: Prevention, quality and other innovations in hardware debug
nosnhojn   7/5/2012 5:38:13 PM
Great! Thanks! I guess I'm so used to seeing skepticism about agile hardware that I'm starting to see it where it isn't :(. I'll have to work on that. -neil

re: Prevention, quality and other innovations in hardware debug
ShashiB   7/5/2012 5:04:39 PM
I am not a skeptic of TDD or Agile. My background is software. I think the Agile/Extreme methods are key not only for hardware designers but also for EDA software tool developers. Great article and info on SVUnit. Thanks.

re: Prevention, quality and other innovations in hardware debug
nosnhojn   7/5/2012 3:38:01 PM
ShashiB, interesting perspective. Within the hardware community, I think it's pretty common to doubt some software techniques because the target technology is so obviously different, so I'm certain you're not the only skeptic out there. I think there's some merit to the argument that software methods won't work for hardware when talking about agile in general (though I don't agree with it)... but the reason I think the potential for TDD is so great is that there the argument goes out the window.

When it comes to code/design quality, target technology is irrelevant. Regardless of target - software/hardware/simulation/synthesis/reference model/test tube/whatever - a design that is robust and functionally correct is far more valuable than untested code. Thanks to the Mentor Graphics study, it's pretty easy to make the point that paying for untested code (i.e. debug) is a very good way to kill your budget. From there, I'd hope people can see that an aversion to risk is irresponsible when the methods you're using are so obviously inadequate.

Like you point out, there are other things to consider. But strictly in terms of quality, I reckon TDD is a good alternative method. Thanks for the comment! -neil
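For readers who haven't seen TDD applied to hardware, a minimal SVUnit test looks roughly like the sketch below. The and_gate design and the test name are made up for illustration; the surrounding template follows the structure SVUnit generates, though exact macro spellings should be checked against the current release.

`include "svunit_defines.svh"

// Unit test for a hypothetical 1-bit AND gate, written TDD-style:
// the test exists (and fails) before the and_gate RTL does.
module and_gate_unit_test;
  import svunit_pkg::svunit_testcase;

  string name = "and_gate_ut";
  svunit_testcase svunit_ut;

  // Unit under test
  logic a, b, y;
  and_gate my_and_gate (.a(a), .b(b), .y(y));

  function void build();
    svunit_ut = new(name);
  endfunction

  task setup();
    svunit_ut.setup();
    a = 0; b = 0;  // start every test from a known state
  endtask

  task teardown();
    svunit_ut.teardown();
  endtask

  `SVUNIT_TESTS_BEGIN

  `SVTEST(y_high_only_when_both_inputs_high)
    a = 1; b = 1; #1;
    `FAIL_UNLESS(y === 1'b1);
    b = 0; #1;
    `FAIL_IF(y === 1'b1);
  `SVTEST_END

  `SVUNIT_TESTS_END

endmodule

// The eventual RTL, written after the test in TDD order:
module and_gate (input logic a, b, output logic y);
  assign y = a & b;
endmodule

The point isn't the framework mechanics; it's that the failing test pins down intent before any RTL is written, so quality is designed in rather than debugged in afterward.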

re: Prevention, quality and other innovations in hardware debug
ShashiB   7/5/2012 2:44:34 PM
Traditionally, hardware/HDL designers have been wired to write code that directs synthesis tools to generate certain types of structures. RTL code and simulation are not the end goal. That's where this differs from software, where the software product itself is the end goal. Synthesis tools are getting better, but they're still not on par with compilers targeting different computer architectures. This leaves a big gap between the intent captured in RTL and the final silicon, and treating hardware design as a software project is not going to work until the tools close that gap.

The current mentality seems to be: if the designer has to do a whole bunch of tweaking post-synthesis anyway, why spend time on RTL verification? Exhaustive verification seems to make sense only on the final hardware system in the lab. Risky, but that seems to be the reality. Then there are the hardware project managers who measure the quality of a project, or the contribution of a design engineer, by how early the chip gets into the lab (especially for FPGA). This has not helped designers break the pattern, or train and evolve their design flows.

I would say debug time is much more than 32%, and the approach taken is to throw more bodies at the debug problem. The reasons for this, again, are the gap between RTL and silicon, and risk-averseness about using unproven newer methods.
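As a sketch of that "RTL directs synthesis" point (module and signal names are made up), nearly identical procedural blocks tell a synthesis tool to build three very different structures:

module intent_examples (
  input  logic       clk, sel,
  input  logic [7:0] a, b,
  output logic [7:0] mux_q,    // intended: combinational mux
  output logic [7:0] latch_q,  // accidental: inferred latch
  output logic [7:0] flop_q    // intended: bank of flip-flops
);

  // Fully specified combinational logic -> a multiplexer
  always_comb begin
    if (sel) mux_q = a;
    else     mux_q = b;
  end

  // Missing else branch: latch_q must hold its value when sel is 0,
  // so synthesis infers a transparent latch (usually a bug)
  always @(*) begin
    if (sel) latch_q = a;
  end

  // Edge-triggered process -> flip-flops
  always_ff @(posedge clk)
    flop_q <= a & b;

endmodule

All three blocks simulate plausibly, but the coding style alone determines what hardware gets inferred, which is exactly the intent gap between RTL and silicon described above.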
