MONTEREY, Calif. -- A standard chip design methodology is a necessity for large design teams, but it has its share of drawbacks and hurdles, according to Dan Smith, director of hardware engineering for Nvidia Corp., who described his company's evolution toward a standard methodology in a keynote at the Electronic Design Processes (EDP-2002) workshop here.
Debates arose at this year's EDP-2002, a small but influential conference for CAD methodology experts, over system-level design, platform-based design, and RTL versus placement-based sign-off. Smith's keynote on Monday (April 22) set the tone by underlining the advantages, and limitations, of a standard methodology.
Smith described Nvidia's transition from having "no methodology at all" in its early days, to developing a consistent chip design methodology under the control of a dedicated organization. But standardization is no substitute for having qualified designers, Smith warned.
"We started out doing a lot of things wrong, mainly for practical reasons," Smith said. "But we managed to get some large chips out quickly without many bugs at all." Then the company's chip designs got much, much larger. Nvidia's latest graphics processor, for example, has over 60 million transistors.
"It became clear that we needed a standard methodology across platforms, where it makes sense," Smith said. A consistent design methodology reduces the learning curve, provides for better design reuse, helps ensure consistent quality, and avoids instances of "reinventing the wheel," he said.
When Nvidia developed its first graphics processor, Smith said, one designer was assigned to each functional unit. Each designer had a different way of approaching the design, and different scripts were used for each unit. There was no documentation, and sometimes the same tool was modified in many different ways. "People never had time to think things through other than the practicality of getting the unit out," he said.
This diversity caused some problems. For instance, Smith said, over 20 different ways had been used to automate synthesis. That made it very hard to make any global changes, because designers had to go through many scripts to learn how all of the functional units were designed.
Anything that takes engineering time should be standardized, as long as it doesn't hurt the designer, Smith said. Good candidates for standardization include verification languages and models; synthesis, timing, and layout flows; design for test; emulation; and formal verification. A company should also standardize EDA tool selection, Smith said, because "there's no need to buy the same tool twice from different vendors."
As for drawbacks, Smith said standardization can slow the adoption of new techniques, and it requires a dedicated staff to support it. Moreover, different types of chips may require different methodologies, he said.
Picking the right methodology to standardize can be difficult, and a decision on standardization can disrupt projects already under way, he said.
To develop a standardized methodology, Smith suggested that programmers develop robust tools and flows that are scalable across projects. Tool experts should work with chip designers and programmers, and companies should plan ahead and be "opportunistic" in deployment.
"Always hire great chip designers and encourage them to innovate," Smith said. "Methodology is never a substitute for intelligent engineers working together."