In one panel discussion, Wally Rhines, Mentor Graphics Corp. chairman and CEO, asked user representatives why EDA revenues are flat while semiconductor revenues are up. "We have to provide better value every year," replied Gadi Singer, vice president and general manager of Intel Corp.'s low-power IA and technologies group. "The EDA industry has solved enough problems to stay in one place. If you can cut the cost of a project from $50 million to $30 million, we won't have any problem investing in EDA."
There is widespread agreement that design rules will be at least somewhat restricted at 45 nm and below. Keynote speaker Hans Stork, senior vice president and CTO at Texas Instruments, noted in an EE Times video interview that design for manufacturability (DFM) will not be able to model every detail.
"The space is too complex and there are too many options," he said. "We will have to put some rules in place to structure the design."
In a panel discussion on variability, moderator Bill Joyner, director of computer-aided design and test at Semiconductor Research Corp., presented eight hypothetical EDA companies and asked panelists where they'd invest. It's perhaps significant that one of the top choices was a company that offers variation-resistant regular fabrics.
Another consistent choice was a startup offering lithography and process variation modeling. There was far less interest in extraction, placement, routing and yield optimization.
"We shouldn't forget robust design rules and layout policies to minimize variation in the first place," said Vijay Pitchumani, project engineer at Intel.
There were warnings, however. "We've had RDRs [restricted design rules] for a long time, and they just get more restrictive," said Dennis Buss, vice president for silicon technology development at Texas Instruments. "But you have to be careful. How can you put poly in the same direction or pitch without extensive sacrifices in area?" Tighter design rules, he noted, must still allow designers to achieve smaller areas as they move to smaller process nodes.
"Regularity is a good thing, but RDRs alone are sort of a reverse scaling approach," said Dennis Sylvester, associate professor of electrical engineering and computer science at the University of Michigan. There are alternatives, he said. One of them derives from the observation that isolated and dense lines show opposite behavior under varying defocus conditions. Mixing isolated and dense cells can compensate for variation with seven times less area penalty than single-pitch RDRs, Sylvester said.
Random, systematic variations
Designers have long noticed that some variations are random and statistical, while others are systematic and can be modeled. Intel's Pitchumani said it's best to model deterministic variations first. These include device variations such as lithography effects; most interconnect variations, including lithography and chemical-mechanical polishing (CMP) effects; and most voltage and temperature variations.
The variability problem, said Riko Radojcic, design-to-silicon initiative director at Qualcomm CDMA Technologies, is really an "accounting" problem. "Process corners are too wide," he said. "We put all sorts of variation into one set of corners."
So what to do? First, he said, "peel out" the systematic effects. If an effect can be modeled, Radojcic said, do so, and then design around it and take it out of the corners. This would include lithography, CMP, orientation, density and on-chip variation.
Second, Radojcic said, don't use corners based on immature processes. "Beat up on the foundries for predictive models of how the corners will be a year from now," he said. "They'll be less pessimistic." Once this is done, designers can focus on truly random, statistical variations, he said.
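A back-of-the-envelope sketch makes the "accounting" argument concrete (all percentages below are invented for illustration): a corner that lumps every effect together must cover their worst-case sum, while a corner kept only for the random residue is far tighter.

    # Hypothetical delay-variation contributions, as +/- percent of nominal.
    systematic = {"lithography": 4.0, "CMP": 3.0, "orientation": 2.0, "density": 2.0}
    random_residue = 3.0  # the truly statistical part, e.g. dopant fluctuation

    # One-size-fits-all corner: every effect piles into the same guard band.
    lumped = sum(systematic.values()) + random_residue
    print(f"lumped corner: +/-{lumped:.1f}%")          # +/-14.0%

    # Radojcic's approach: model ("peel out") the systematic effects, design
    # around them explicitly, and let the corner cover only the random residue.
    print(f"peeled corner: +/-{random_residue:.1f}%")  # +/-3.0%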
Radojcic said he expects the design flow to remain pretty much the same, with some added capabilities such as a shape simulator for lithography effects. "If all else fails and I need to deal with statistical variability, I'll go to statistical timing analysis," he said.
"Statistical timing is a good idea so long as you don't assume that variations are statistical," said TI's Buss. About the only thing that's truly statistical, he said, are random dopant fluctuations. With anything else, statistical timing may give the wrong answer, Buss said.
But ST's Magarshack is more positive about statistical timing. "In real life, I believe we cannot accurately predict temperature and voltage variations," he said. "We may have to take a statistical approach to compensate for these unknowns."
So far, however, there's little designer buy-in for statistical timing, observed Sylvester of the University of Michigan. There may be an alternative approach to statistical optimization, he said, that relies on deterministic formulations and "intelligent variation space sampling."
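Buss's caution can be made concrete with a small Monte Carlo sketch (the stage count and sigma are toy numbers, not figures from the panel): if the delays of a path's stages all move together because of one shared systematic effect, a model that treats them as independent random variables understates the real spread.

    import random
    import statistics

    STAGES, SIGMA, TRIALS = 20, 0.05, 20000  # 20 stages, 5% per-stage sigma

    def path_delay_sigma(shared):
        """Std. dev. of total path delay, with shared or independent variation."""
        totals = []
        for _ in range(TRIALS):
            if shared:
                shift = random.gauss(0, SIGMA)  # one systematic shift hits all stages
                totals.append(sum(1.0 + shift for _ in range(STAGES)))
            else:
                totals.append(sum(1.0 + random.gauss(0, SIGMA) for _ in range(STAGES)))
        return statistics.stdev(totals)

    print(path_delay_sigma(shared=False))  # ~0.22: grows as sqrt(20) stages
    print(path_delay_sigma(shared=True))   # ~1.00: grows as 20 stages

In other words, assuming independence averages away a correlated, systematic effect, which is the "wrong answer" Buss warned about; only genuinely independent sources, such as random dopant fluctuations, are safe to treat that way.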
Cost, density, power
Variation isn't the only concern of the power users now moving to 65 and 45 nm. For TI's Buss, the biggest concern is the high cost of custom designs. With design costs around $50 million, he said, "low-volume ASICs will become a thing of the past." That cost is driven up dramatically, he said, by power-management needs, parametric variation and system-on-chip integration, including analog components.
Intel's Singer outlined four major nanometer design challenges. One is increasing density, which brings huge logic capacity; another is increasing complexity, with technologies such as multiple power domains. A third is the convergence of computing and communications, which drives low-power requirements. The fourth is time-to-market.
With more analog circuitry being integrated on-chip, analog/mixed-signal design was also a frequently cited concern. In that realm, said ST's Magarshack, "designer productivity has not improved significantly. . . . There's a lot of work for the EDA industry."