LOS ANGELES Working at the "bleeding edge" of chip design leads to a number of challenges with EDA tools and methodologies, according to presenters at the Design Automation Conference. At a panel on chip design case studies, Philips Semiconductors, Nvidia Corp. and Xilinx Inc. detailed their experiences with extremely aggressive chip designs.
Moderator John Cohn, senior technical staff member at IBM Corp., said the panel was convened to examine chips that were "too wild, too big, too critical, and too weird to fit into traditional ASIC design methodologies, but did not have huge design teams or large budgets."
The first chip discussed was the recently completed NX-2700 digital TV processor from Philips Semiconductors. Santanu Dutta, VLSI logic design manager for Philips' DTV business line (Sunnyvale, Calif.), said that "10 designers bled for about a year" to complete the chip.
The DTV processor takes in high-definition digital video and audio, and sends analog output to televisions and VCRs. It has a high-performance internal bus linking such blocks as an MPEG-2 decoder, a VLIW CPU core, a high-definition video output, and various interface blocks, including a PCI core.
The chip has 14 clock domains, a hierarchical clock tree, and a debugging mechanism that calls for daisy-chaining peripheral units on the chip. It was fabricated in 0.25-micron technology with five levels of metal. Dutta said there were 18 million devices on the chip.
Among the tools used by Dutta's group were Cadence's Verilog-XL simulator, Synopsys' Design Compiler synthesis tool, Avant!'s Design Verifyer equivalency checker, Quickturn emulation, and Cadence placement and routing. Philips used its internal tools for scan insertion and automatic test pattern generation.
Philips' verification approach divided the chip into more accessible "chiplets" that could be attacked by individual engineers, Dutta said. Philips engineers wrote C language or assembly programs, compiled them, and executed instructions on a Verilog model of the chip. A C-based, cycle-accurate simulator was used in concert with Verilog simulation.
Time for change
The chip design provided a learning experience. Dutta said the team underestimated the amount of disk space needed for design data; found that regression testing chewed up more compute power than expected; and had some difficulty compiling netlists for emulation. When the silicon came back, its power dissipation was too high and it had timing problems. Designers had to change the spec to lower the chip's clock speed and slightly raise the power dissipation.
Dutta said his group needs better tools for signal integrity analysis. "How can I trust the numbers that come out of these tools? I've done so and suffered in the past," he said. He also said run-times for some tools can be days or even weeks, and that user interfaces need work in areas such as formal verification.
Panelist Chris Malachowsky, vice president of engineering at Nvidia (Santa Clara, Calif.), a graphics processor vendor, talked about general chip design problems rather than about any single chip. He noted that Nvidia is now working on designs with 25 million and 50 million transistors, and is planning a graphics processing unit with 65 million transistors for Microsoft's X-Box game platform. All of these chips have extremely short design cycles, he said.
Nvidia's design methodology includes the development of "transaction-accurate" C language models. These models are compared against RTL models, and the C model is not considered "golden" until this comparison is validated. Designers can write tests at any level, and reuse them at different levels of abstraction.
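Nvidia's actual flow isn't detailed in the presentation, but the comparison step Malachowsky describes can be sketched roughly as follows. All class and function names here are hypothetical, and the toy "design" (a 32-bit accumulator) merely stands in for real chip logic:

```python
class TxnModel:
    """Transaction-accurate model (C-style reference): one result per transaction."""
    def __init__(self):
        self.acc = 0

    def write(self, value):
        # One "transaction": accumulate a 32-bit value and return the result.
        self.acc = (self.acc + value) & 0xFFFFFFFF
        return self.acc


class RtlModel:
    """Stand-in for the RTL simulation: clocked, but same architected state."""
    def __init__(self):
        self.acc = 0

    def clock(self, value):
        # One clock edge carrying the operand.
        self.acc = (self.acc + value) & 0xFFFFFFFF

    def read(self):
        return self.acc


def compare(transactions):
    """Drive both models with the same stimulus and flag the first mismatch.
    The transaction-level model is only treated as 'golden' once this passes."""
    txn, rtl = TxnModel(), RtlModel()
    for i, value in enumerate(transactions):
        expected = txn.write(value)
        rtl.clock(value)
        if rtl.read() != expected:
            return i          # index of the first mismatching transaction
    return None               # models agree on every transaction


# A test written at the transaction level can be reused unchanged against
# either level of abstraction, since both models see the same stimulus.
mismatch = compare([1, 2, 0xFFFFFFFF, 7])
```

The point of the structure is that stimulus and checks live above both models, so the same test runs against the fast C model during architecture work and against the RTL once it exists.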
In the design capture realm, Malachowsky said that the Unix emacs and vi editors are "my best CAD tools." He praised Synopsys' Module Compiler, but noted that his biggest challenge is control logic, for which there is currently little automation.
Malachowsky said that verification now takes 80 percent of the design cycle, and said that Nvidia has "way more" people on verification than on design. "Verification takes too long and is too hard, but we understand it," he said. "Simulation is still my best tool, which is sad." Emulation is "great," he said, but is costly and difficult to use. Timing analysis, he said, is difficult to use when multiple clock domains are involved.
In the physical design area, Malachowsky said, most of the problems are in placement. "Routing is well understood, but with a bad placement, routing can't overcome it," he said. Physical partitioning is not yet automated, hierarchical design is not well supported, and most tools require a design rule check (DRC), he said. "It annoys me that I have to run a DRC to go fix problems created by the tools themselves," he said.
One of the biggest problems identified by Malachowsky is the lack of coordination between chip, board, and package design. There are good tools in all three areas, he said, but nothing that models the interaction of all three. Other problem areas noted by Malachowsky include pre-design area estimation, bandwidth analysis, power modeling, and tool documentation.
Panelist Steve Young, an architect of Xilinx's Virtex FPGA family (San Jose, Calif.), devoted part of his presentation to advocating FPGAs. But he surprised many in the audience when he said he expects the number of transistors on a single FPGA to reach 500 million in 2001, with the Xilinx Virtex-II family.
Engineers designing FPGA silicon for companies such as Xilinx are coping with some of the most difficult deep-submicron devices around, Young said. "We face a number of problems so you don't have to," he said. These include simultaneous switching, deep-submicron parasitics, fault coverage, and clock skew management.
"Like all chip designers, we are customers of EDA tools," said Young. "And we are challenging them." Young said that commercial EDA tools just aren't designed for the size of devices Xilinx engineers are building. In particular, he said, Xilinx needs tools that can handle larger HDL simulations and circuit simulations.
Presentations from the panel will be available at the DAC Web site within the next few weeks.