San Francisco -- The effects of process variations loomed large at last week's Design Automation Conference here, as leading-edge chip designers spelled out the challenges they face at 65 nanometers and beyond. But many offered differing views and approaches for dealing with the problem.
One quandary facing chip makers is whether to try to model everything using design-for-manufacturability (DFM) tools, or to employ restricted design rules (RDRs) that will result in more regular fabrics. Discussions last week showed that designers are hoping for a balanced approach: one that imposes some restrictions, but preserves enough of the area and performance gains that moving to smaller process nodes remains worthwhile.
Another issue that's unresolved is whether and when to use statistical timing analysis. Some designers are clearly skeptical, believing that it's effective for random variations only. In any case, there seems to be support for separating out systematic variations and modeling those before going to a statistical approach.
But one thing is clear: users who have delved into 65 nm are feeling the pain. "My biggest concern at the 65-, 45- and 32-nm process nodes is variability," said Ho-Kyu Kang, vice president at Samsung Electronics. "Critical design rules have been scaled by 30 percent every other year, but variability has not scaled by the same rule. So variability becomes bigger and bigger as design rules scale."
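Kang's point can be illustrated with a little arithmetic. In the hypothetical sketch below (the starting dimension, sigma, and shrink factor are invented for illustration, not Samsung data), critical dimensions shrink 30 percent per generation while the absolute variation stays flat, so variation as a fraction of the feature size grows geometrically:

```python
def relative_variability(start_nm=90.0, sigma_nm=3.0, shrink=0.7, gens=4):
    """Sigma as a fraction of the critical dimension over several
    process generations, assuming (hypothetically) that the absolute
    sigma does not shrink at all while dimensions scale by 30 percent."""
    dims = [start_nm * shrink ** g for g in range(gens)]
    return [(round(d, 1), sigma_nm / d) for d in dims]

for dim, ratio in relative_variability():
    print(f"{dim:5.1f} nm  ->  {ratio:.1%} relative variability")
```

Even if the absolute sigma shrinks somewhat in practice, any scaling slower than the design rules produces the same trend: variability "becomes bigger and bigger" relative to the features it affects.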
STMicroelectronics foresees a "discontinuity" with respect to the design tool solutions needed at 45 nm and beyond, said Philippe Magarshack, vice president of central CAD at STMicroelectronics. "We're dealing with restricted design rules on the one hand, and on the other looking for any way we can to predict system variability and account for it in design, rather than with design margins."
Clive Bittlestone, fellow and physical verification manager at Texas Instruments Inc., noted that simple corner analysis with margins is becoming a struggle. "That keeps me awake at night," he said. "It's a key shift." Most design-for-manufacturing effects are served by available tools, but "true" variability analysis and optimization is still needed, he said.
What's most bothersome? Bittlestone showed a list of design concerns at various process nodes. His top concerns at 65 nm are gate shape, design rule checking, models, statistical timing analysis and placement and routing. Critical-area analysis, stress and extraction ranked lower.
Aggressive users often feel shortchanged by commercial EDA vendors. But Magarshack said users can't expect tools too soon. "Until we do designs at these nodes, we can't prioritize the issues and ask for solutions," he said. "There are no tools for 45 nm for us to use--that's obvious. So it's crucial to have good working relationships with the right partners."
In one panel discussion, Wally Rhines, Mentor Graphics Corp. chairman and CEO, asked user representatives why EDA revenues are flat while semiconductor revenues are up. "We have to provide better value every year," replied Gadi Singer, vice president and general manager of Intel Corp.'s low-power IA and technologies group. "The EDA industry has solved enough problems to stay in one place. If you can cut the cost of a project from $50 million to $30 million, we won't have any problem investing in EDA."
There is widespread agreement that design rules will be at least somewhat restricted at 45 nm and below. Keynote speaker Hans Stork, senior vice president and CTO at Texas Instruments, noted in an EE Times video interview that DFM will not be able to model every detail.
"The space is too complex and there are too many options," he said. "We will have to put some rules in place to structure the design."
In a panel discussion on variability, moderator Bill Joyner, director of computer-aided design and test at Semiconductor Research Corp., presented eight hypothetical EDA companies and asked panelists where they'd invest. It's perhaps significant that one of the top choices was a company that offers variation-resistant regular fabrics.
Another consistent choice was a startup offering lithography and process variation modeling. There was far less interest in extraction, placement, routing and yield optimization.
"We shouldn't forget robust design rules and layout policies to minimize variation in the first place," said Vijay Pitchumani, project engineer at Intel.
There were warnings, however. "We've had RDRs for a long time, and they just get more restrictive," said Dennis Buss, vice president for silicon technology development at Texas Instruments. "But you have to be careful. How can you put poly in the same direction or pitch without extensive sacrifices in area?" Tighter design rules, he noted, must still allow designers to achieve smaller areas as they go down in process nodes.
"Regularity is a good thing, but RDRs alone are sort of a reverse scaling approach," said Dennis Sylvester, associate professor of electrical engineering and computer science at the University of Michigan. There are alternatives, he said. One of them derives from the observation that isolated and dense lines show opposite behavior under varying defocus conditions. Mixing isolated and dense cells can compensate for variation with seven times less area penalty than single-pitch RDRs, Sylvester said.
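The compensation idea Sylvester described can be sketched as a toy model. The quadratic form and all coefficients below are invented for illustration; the only property taken from his remarks is that isolated and dense lines drift in opposite directions under defocus, so a mixed placement can largely cancel the sensitivity:

```python
def cd_error(defocus_um, pattern):
    """Critical-dimension error (nm) as a quadratic in defocus.

    Isolated lines widen with defocus here while dense lines narrow --
    the opposite-sign behavior Sylvester described. The coefficients
    are hypothetical, chosen only to make the cancellation visible.
    """
    coeff = {"isolated": +40.0, "dense": -35.0}  # nm/um^2, invented
    return coeff[pattern] * defocus_um ** 2

def mixed_path_error(defocus_um, frac_isolated):
    """Net CD error of a path mixing isolated and dense cells."""
    return (frac_isolated * cd_error(defocus_um, "isolated")
            + (1 - frac_isolated) * cd_error(defocus_um, "dense"))

# An all-isolated or all-dense path sees the full error; a roughly
# balanced mix (here 35/75 isolated) cancels the defocus sensitivity.
for frac in (0.0, 35 / 75, 1.0):
    print(f"{frac:.2f} isolated -> {mixed_path_error(0.1, frac):+.3f} nm")
```

The appeal of the approach, versus single-pitch RDRs, is that the cancellation comes from placement choices rather than from forcing every line onto one pitch, which is where Sylvester's claimed seven-fold reduction in area penalty would come from.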
Random, systematic variations
Designers have long noticed that some variations are random and statistical, while others are systematic and can be modeled. Intel's Pitchumani said it's best to model deterministic variations first. This would include device variations such as lithography effects, most interconnect variations including lithography and chemical-mechanical polishing (CMP) effects and most voltage and temperature variations.
The variability problem, said Riko Radojcic, design-to-silicon initiative director at Qualcomm CDMA Technologies, is really an "accounting" problem. "Process corners are too wide," he said. "We put all sorts of variation into one set of corners."
So what to do? First, he said, "peel out" the systematic effects. If an effect can be modeled, Radojcic said, do so, and then design around it and take it out of the corners. This would include lithography, CMP, orientation, density and on-chip variation.
Second, Radojcic said, don't use corners based on immature processes. "Beat up on the foundries for predictive models of how the corners will be a year from now," he said. "They'll be less pessimistic." Once this is done, designers can focus on truly random, statistical variations, he said.
Radojcic said he expects the design flow to remain pretty much the same, with some added capabilities such as a shape simulator for lithography effects. "If all else fails and I need to deal with statistical variability, I'll go to statistical timing analysis," he said.
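Radojcic's "accounting" argument reduces to simple bookkeeping. In the hypothetical budget below (all the picosecond numbers are invented; only the list of effect names comes from his remarks), lumping every effect into one corner window makes it far wider than the truly random residue that should remain after the systematic effects are modeled and peeled out:

```python
# Hypothetical delay-variation budget for one path, in picoseconds.
systematic = {            # effects that can be modeled and designed around
    "lithography": 12.0,
    "CMP": 8.0,
    "orientation": 5.0,
    "density": 6.0,
    "on_chip_variation": 10.0,
}
random_sigma = 4.0        # truly random residue (e.g. dopant fluctuation)

# Old-style corner: everything lumped into one worst-case window.
lumped_corner = sum(systematic.values()) + 3 * random_sigma

# Radojcic-style accounting: model the systematic effects and take them
# out of the corners; only the random residue needs a statistical margin.
peeled_corner = 3 * random_sigma

print(f"lumped corner margin: {lumped_corner:.0f} ps")
print(f"peeled corner margin: {peeled_corner:.0f} ps")
```

The gap between the two margins is the pessimism Radojcic is complaining about when he says process corners are "too wide."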
"Statistical timing is a good idea so long as you don't assume that variations are statistical," said TI's Buss. About the only thing that's truly statistical, he said, are random dopant fluctuations. With anything else, statistical timing may give the wrong answer, Buss said.
But ST's Magarshack is more positive about statistical timing. "In real life, I believe we cannot accurately predict temperature and voltage variations," he said. "We may have to take a statistical approach to compensate for these unknowns."
So far, however, there's little designer buy-in for statistical timing, observed Sylvester of the University of Michigan. There may be an alternative approach to statistical optimization, he said, that relies on deterministic formulations and "intelligent variation space sampling."
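The corner-versus-statistical trade-off the panelists are debating can be sketched with a Monte Carlo toy model (all numbers are invented for illustration). If per-gate delay variation really is random and independent, a long path averages it out and a corner analysis that pushes every gate to +3 sigma simultaneously is heavily pessimistic; that averaging is exactly what vanishes, per Buss's caveat, when the variation is actually systematic and correlated across gates:

```python
import random
import statistics

random.seed(0)
N_GATES = 50      # gates on the path (hypothetical)
NOMINAL = 20.0    # nominal delay per gate, ps (hypothetical)
SIGMA = 2.0       # random per-gate sigma, ps (hypothetical)

# Corner analysis: every gate simultaneously at +3 sigma.
corner_delay = N_GATES * (NOMINAL + 3 * SIGMA)

# Monte Carlo statistical timing: independent random gate delays,
# so per-gate variation partially averages out along the path.
samples = [sum(random.gauss(NOMINAL, SIGMA) for _ in range(N_GATES))
           for _ in range(20000)]
mean = statistics.mean(samples)
stat_3sigma = mean + 3 * statistics.stdev(samples)

print(f"corner delay:          {corner_delay:.1f} ps")
print(f"statistical 3-sigma:   {stat_3sigma:.1f} ps")
```

If the same 2-ps shift were systematic (every gate moving together, as with a lithography or temperature effect), the corner number would be the correct one, which is why Buss insists on separating out systematic variation before trusting a statistical answer.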
Cost, density, power
Variation isn't the only concern of the power users now moving to 65 and 45 nm. For TI's Buss, the biggest concern is the high cost of custom designs. With design costs around $50 million, he said, "low-volume ASICs will become a thing of the past." The cost is impacted dramatically, he said, by power-management needs, parametric variation and system-on-chip integration, including analog components.
Intel's Singer outlined four major nanometer design challenges. One is increasing density, leading to a huge logic capacity; another is increasing complexity, with technologies such as multiple power domains. A third challenge, the convergence of computing and communications, creates low-power demands. The fourth challenge is time-to-market.
With more analog circuitry being integrated on-chip, analog/mixed-signal design was also a frequently cited concern. In that realm, said ST's Magarshack, "designer productivity has not improved significantly. . . . There's a lot of work for the EDA industry."
See related chart