SANTA CLARA, Calif. There is very little that can be done about chip failures other than to design chips correctly in the first place, a panel concluded Wednesday (Feb. 8) at DesignCon 2006 here.
Six panelists, representing chip designers, verification engineers, design tool suppliers, intellectual property (IP) providers and design consultants, debated how much chip failure can be mitigated by applying cleaner design methodologies.
"Some chips will always fail, but it’s getting too expensive to take multidimensional risks all the time,” said Grant Martin, chief scientist at Tensilica. It might be time to change the paradigm to instill in designers the need to use system-level design methodologies, “because the complexity of the chips demands that. What’s more, verification can’t catch the problem if the system isn’t well architected and specified,” said Martin
Citing a study by consulting firm Collett International that found that 57 percent of chips fail on first pass, Ira Chayut, verification manager at Nvidia, also pointed to complexity as the culprit behind failures.
“People don’t kill chips, complexity kills chips,” Chayut said.
Chayut called for ways to put all the transistors available to designers to work, even suggesting redundant architectures that double up on critical paths so that, in case of a failure, the spare path takes over operation of the chip.
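A minimal sketch of the failover idea Chayut raised might look like the following; the duplicated datapath, the health signal and the selection logic here are hypothetical illustrations, not anything presented on the panel.

```python
def primary_path(x):
    """Hypothetical primary datapath: some function of the input."""
    return (x * 2 + 1) & 0xFF

def spare_path(x):
    """Redundant copy of the same datapath, idle until needed."""
    return (x * 2 + 1) & 0xFF

def redundant_compute(x, primary_healthy):
    """Select the spare path when the primary reports a fault.

    primary_healthy stands in for whatever built-in self-test or
    error-detection signal a real chip would expose.
    """
    if primary_healthy:
        return primary_path(x)
    return spare_path(x)  # failover: the spare takes over operation

# Simulate a run in which the primary fails after the third cycle.
for cycle in range(6):
    healthy = cycle < 3
    print(cycle, redundant_compute(cycle, healthy))
```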
“Maybe what we need," Chayut said, "is someone on the design team designated as the ‘architect of verification’ whose responsibilities complement those set by the design-for-test and design-for-volume guys.”
“There is no silver bullet for verification,” said Thomas Anderson, director of technical marketing at Synopsys. “A proven methodology ties it all together.”
Anderson advocated applying block-level verification early in the design, where many bugs can be caught before they propagate to the full chip.
“Static analysis tools are the best way to address this,” said Anderson. He cited the use of assertions and constrained-random stimulus generation as methods that need to be applied in today’s complex IC designs.
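For readers unfamiliar with the terms, here is a small Python sketch of what constrained-random stimulus plus an assertion check look like in spirit; the FIFO model, its depth and the legality constraints are invented for the example, not drawn from any panelist's flow.

```python
import random

DEPTH = 8   # assumed FIFO depth for this example
fifo = []

random.seed(0)
for _ in range(1000):
    # Constrained-random stimulus: only generate operations that are
    # legal in the current state (no push when full, no pop when empty).
    legal_ops = (["push"] if len(fifo) < DEPTH else []) + \
                (["pop"] if fifo else [])
    op = random.choice(legal_ops)

    if op == "push":
        fifo.append(random.randrange(256))  # random 8-bit payload
    else:
        fifo.pop(0)

    # Assertion: an invariant the design must never violate.
    assert 0 <= len(fifo) <= DEPTH, "FIFO over- or underflow"
```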
The panelists agreed that functional verification needs to be applied rigorously, since functional failures make or break a chip’s operation.
“Many chips work well enough for them to work in the system,” said moderator Brian Bailey of Bailey Consulting. “Functional verification ensures that.”
Russ Vreeland, senior principal verification engineer at Broadcom, agreed that functional verification can live up to expectations if certain design criteria are applied. He called for designers to standardize on deliverable IP, implement well-defined protocols to keep interfaces simple, and choose one common testbench environment that has been proven to work.
Emulation/acceleration should be applied when necessary to “get into the box” as much as possible, Vreeland said, and both directed and randomized testing should be used.
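As a rough illustration of combining the two styles, the sketch below runs a handful of hand-picked directed cases before a batch of random ones; the eight-bit adder standing in as the device under test, and its corner-case vectors, are hypothetical.

```python
import random

def dut_add(a, b):
    """Stand-in for the device under test: an 8-bit adder model."""
    return (a + b) & 0xFF

# Directed tests: hand-picked corner cases (zeros, carries, wraparound).
directed = [(0, 0), (255, 1), (128, 128), (255, 255)]

# Randomized tests: broad coverage of the rest of the input space.
randomized = [(random.randrange(256), random.randrange(256))
              for _ in range(1000)]

for a, b in directed + randomized:
    assert dut_add(a, b) == (a + b) % 256, f"mismatch at {a}, {b}"
```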
Designers should “eschew complexity for the sake of complexity,” Vreeland suggested.
“Hire some philosopher-scientists who appreciate simplicity,” Vreeland said. “Engineers love complexity too much, and as a result complexity is running amok.”