In the 1970s, the U.S. automobile industry thought the rules didn't apply to them. They fretted about the costs of building too much quality into their manufacturing processes. Instead, they depended on inspection to deliver a quality product. Inspectors identified defects in cars as they came off the line, and then a separate team fixed any issues found during inspection to ready the cars for sale. The resulting quality level? Pretty poor. It turns out that there isn't so much a cost to building quality into the product as there are huge costs in not doing so. By relying on downstream inspection teams to achieve quality, automobile manufacturers were not addressing poor quality at its source, which tainted the entire process. Part builders and assemblers felt no ownership of quality, and with another group responsible for identifying and fixing problems, there was no feedback or learning, either.
W. Edwards Deming, the famous quality guru, had a tenet that captured this: "Cease dependence on inspection to achieve quality." Instead of using inspectors to secure quality in the process, you design quality into every step of the process, particularly the early steps like design and build. No longer controversial or seriously questioned, this is now one of the fundamental rules for achieving quality in a manufacturing process. But chips are different, aren't they? Sure, developing and building chips is a process very similar to manufacturing, but you cannot tape out a chip without verifying it. "Inspection" is critical, and chip design "depends" on it. So, does that mean chip design teams can ignore the common wisdom for achieving quality in a manufacturing process? Is there no point in focusing upstream on design quality?
"I Have a Verification Problem, Not a Design Problem"
When discussing the relative value of design quality versus verification effort with others over the years, I've heard comments like "if the design were perfect, I'd still have to write the same number of tests" and "I have a verification problem, not a design problem." It's hard to argue with these sentiments. On the former: no matter how good the design quality is before verification begins, you may "trust," but you must also "verify," which requires completing your test plan. On the latter: setting software aside, verification is the most resource-intensive effort and the longest pole in chip development. Whatever the source of a bug, the problem manifests itself in verification.
Are design quality and chip verification really decoupled? Ignoring the impact of debugging, maybe they were when chip architectures could be validated on the back of a napkin and the software content was negligible. But chip development is no longer just about getting the hardware right. It's now about validating that you've built the right architecture and delivering a complete system solution, which primarily means delivering the software along with the silicon. Setting verification aside, a key milestone on the critical path for system validation and software development is achieving a functional chip design, not completing your verification test plan. The sooner your chip design is functional, the sooner it can be used for system validation and software development by running it in emulation or FPGA prototyping. And for this, design quality is a key lever. Consider the following thought experiment: if your design were perfect, would you leverage emulation or FPGA prototyping immediately rather than near the end of verification?
Is there no benefit for chip verification as well? In every chip development I've been involved in, poor design quality has been a significant source of project drag: during verification integration, throughout verification itself by diverting considerable effort to debug rather than test writing and other activities, and by delaying software development and FPGA prototyping.
In manufacturing, design quality is a passion because it improves the entire development process. In chip development, while verification is still critically important, it's no longer the core focus; now it's all about system validation and software development. Perhaps it is time to look at design quality as a key lever for improving chip development. Or perhaps the rules don't apply to us.
What do you think? Can we improve chip design? Can we build quality in from the start?

George Harper is vice president of marketing at Bluespec, Inc., where they are extending the boundaries of synthesizable high-level design to include models, test benches, and all types of implementations, and enabling early emulation for modeling, verification, and software development.
If you found this article to be of interest, visit EDA Designline, where you will find the latest design, technology, product, and news articles on all aspects of Electronic Design Automation (EDA).