Designers have brought designs as low as 0.18 micron to tapeout without signal integrity closure, although at the expense of performance. However, below 0.13 micron the entire landscape changes; there, skipping signal integrity analysis is no longer an option. In addition to signal integrity problems, designers must grapple with such challenges as power integrity (or IR drop), electromigration, hot carrier injection, negative bias temperature instability (NBTI) and copper process variation. They must assess, analyze and deal with these problems before signing off on a tapeout.
One reason for the urgency is that mask sets that used to cost $300,000 to $500,000 now cost well over $1 million. That's because of new process complexities at the 0.13-micron level, including higher resolution, more metal layers and the extra mask levels that copper processing requires. The move from 200-mm to 300-mm wafers has also added to the cost of mask sets. So to avoid a million-dollar mistake, designers must understand these changes and adapt their tools and methodologies accordingly.
Lower voltage supply, higher coupling
As designs shrink, the larger aspect ratio and reduced wire spacing create more coupling noise. Meanwhile, the lower supply voltage reduces the circuit noise margin accordingly. Together these trends significantly worsen signal integrity problems, magnifying the need for greater accuracy and greater capacity to process more coupling effects. Today's signal integrity tools must deliver both, so that guardbands can shrink and circuit performance can be maximized. Accurate cell noise characterization filters millions of noise reports down to those that actually change cell timing delay or cause functional failure.
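To see why the shrinking margin matters, consider the first-order capacitive-divider estimate of coupled noise. The sketch below is illustrative only: the capacitance values and the 30 percent noise-margin rule of thumb are assumptions, not figures from any particular process or tool.

```python
# Minimal sketch: first-order capacitive-divider estimate of the noise
# coupled onto a quiet victim net, checked against the circuit noise
# margin. All values are illustrative assumptions.

def coupled_noise(c_coupling_f, c_ground_f, v_aggressor):
    """Peak noise on the victim, by capacitive division."""
    return v_aggressor * c_coupling_f / (c_coupling_f + c_ground_f)

vdd = 1.2                     # supply shrinks with each process node...
noise_margin = 0.3 * vdd      # ...and the noise margin shrinks with it

v_noise = coupled_noise(c_coupling_f=12e-15,   # 12 fF to the aggressor
                        c_ground_f=20e-15,     # 20 fF to ground
                        v_aggressor=vdd)

print(f"coupled noise = {v_noise*1e3:.0f} mV, margin = {noise_margin*1e3:.0f} mV")
if v_noise > noise_margin:
    print("flag this net: possible delay change or functional failure")
```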
With today's chips, the power line runs virtually all around the chip, and as current flows through this long wire, the voltage drops. In earlier designs, with a 3-V supply and shorter interconnect, a drop of a few hundred millivolts was insignificant. In today's low-power designs, however, such a drop can be fatal, because it may cross the threshold of the tolerances required for operation at optimum speed. With the supply level shifted like this, logic cells no longer operate to design specifications, especially in the pull-up swing. Ultimately, the design will fail.
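A back-of-the-envelope model makes the mechanism concrete. The sketch below accumulates IR drop along a rail that feeds a row of cells, using R = rho * L / (W * t) per segment; the dimensions, cell count and per-cell current are invented for illustration.

```python
# Minimal sketch: cumulative IR drop along a power rail tapped by N
# cells. All dimensions and currents are illustrative assumptions.

RHO_CU = 1.7e-8   # bulk copper resistivity, ohm*m (narrow lines run higher)

def segment_resistance(length_m, width_m, thickness_m, rho=RHO_CU):
    return rho * length_m / (width_m * thickness_m)

n_cells, i_cell = 200, 50e-6                       # 200 cells at 50 uA each
r_seg = segment_resistance(10e-6, 2e-6, 0.5e-6)    # 10 um x 2 um x 0.5 um

v_drop, i_remaining = 0.0, n_cells * i_cell
for _ in range(n_cells):      # current tapers off as each cell taps the rail
    v_drop += i_remaining * r_seg
    i_remaining -= i_cell

print(f"worst-case drop at the far end of the rail: {v_drop*1e3:.0f} mV")
```

With these made-up numbers, the far end of the rail sits roughly 170 mV below the pad: exactly the kind of drop that was harmless at 3 V but consumes a large fraction of the margin at 1.2 V.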
Power-analysis tools have been available for years, but they are not up to the challenges presented by this new crop of design complexities. Today's tools should be able not only to identify where the supply voltage drops too much in relative terms, but also to determine precisely whether a particular drop is a problem for a particular device. The tools should give insight into the impact of voltage drop on functional operation and on-chip timing.
To achieve this goal, accurate device power calculation and power-grid analysis are essential. Accurate analysis also avoids nonphysical results, such as a computed wire voltage that dips below zero. Another essential ingredient is an accurately characterized library that captures device voltage-drop tolerance and the timing changes caused by voltage drop.
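At its core, static power-grid analysis reduces to a nodal conductance equation, G * v = i. The toy four-node rail below shows only the formulation; a production tool applies sparse solvers to millions of nodes, and every value here is an assumption.

```python
# Minimal sketch: DC power-grid analysis as a nodal equation G*v = i,
# for a rail pad -- n0 -- n1 -- n2 -- n3 with 0.2-ohm segments.
import numpy as np

g = 5.0                                  # 1 / 0.2 ohm per segment
G = np.array([[2*g, -g,   0,   0],      # n0 connects to the pad and n1
              [-g,  2*g, -g,   0],
              [0,   -g,  2*g, -g],
              [0,    0,  -g,   g]])
i_tap = np.full(4, 2e-3)                 # 2 mA drawn at every node

vdd = 1.2
rhs = np.array([g * vdd, 0, 0, 0]) - i_tap   # ideal pad drives n0's row
v = np.linalg.solve(G, rhs)

print("node voltages:", np.round(v, 4))
print(f"worst IR drop: {(vdd - v.min())*1e3:.1f} mV")
```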
Fortunately, electromigration is not as much of a problem as it used to be, because copper is more resistant to this failure mode. However, as designers push for maximum performance and smaller die sizes, it will remain a design constraint. Another important capability is transient power-grid analysis, which should catch peak-power integrity problems by accounting for decoupling capacitance and inductance effects in high-frequency designs. Thus, design houses considering the leap to 0.13-micron processes must carefully weigh the pros and cons of the available tools before making a commitment.
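On the electromigration point, current-density limits are commonly derived from Black's equation, MTTF = A * J^-n * exp(Ea / kT). The sketch below uses generic textbook constants (n = 2, Ea = 0.9 eV), not any foundry's qualification data, to show how sharply lifetime falls with current density and temperature.

```python
# Minimal sketch: relative electromigration lifetime from Black's
# equation. The constants are generic textbook assumptions.
import math

K_BOLTZ = 8.617e-5   # Boltzmann constant, eV/K

def mttf_rel(j_a_per_cm2, temp_k, n=2.0, ea_ev=0.9, a=1.0):
    return a * j_a_per_cm2**-n * math.exp(ea_ev / (K_BOLTZ * temp_k))

base = mttf_rel(1e5, 378)    # 1e5 A/cm^2 at 105 C
hot  = mttf_rel(2e5, 398)    # double the current density, 20 C hotter

print(f"lifetime relative to baseline: {hot/base:.3f}")   # roughly 0.06
```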
Copper causes new problems
The increased process variation of 0.13-micron copper will become more and more of a concern, as it adds another uncertainty for circuit performance and yield. Some major concerns relating to process variation include:
- Metal thickness variation due to copper chemical mechanical polishing (CMP) and effects of local metal line spacing, width and density.
- Interlayer dielectric thickness variation due to ILD CMP and effects of local underlying metal patterns.
- Width dependency of copper line resistivity.
Process variation is not only process dependent. More important, it is also design dependent. CMP can result in as much as 20 percent variation in the thickness of metal and interlayer dielectric layers, a variation that greatly affects the coupling capacitance. Incorporating process-variation capability into layout parasitic extraction tools will be critical for accurate coupling capacitance extraction and, consequently, for the downstream signal integrity analysis.
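A first-order parallel-plate model shows why a 20 percent thickness swing matters: lateral coupling between neighboring lines scales directly with metal thickness. The dimensions in the sketch below are invented for illustration.

```python
# Minimal sketch: effect of +/-20% CMP metal-thickness variation on
# lateral coupling capacitance, treating facing sidewalls as parallel
# plates: Cc = eps * (t * L) / s. Dimensions are illustrative.
EPS_OX = 3.9 * 8.854e-12          # SiO2 permittivity, F/m

def coupling_cap(thickness_m, length_m, spacing_m):
    return EPS_OX * thickness_m * length_m / spacing_m

t_nom = 0.35e-6                   # nominal metal thickness
for label, t in (("thin  (-20%)", 0.8 * t_nom),
                 ("nominal     ", t_nom),
                 ("thick (+20%)", 1.2 * t_nom)):
    cc = coupling_cap(t, length_m=100e-6, spacing_m=0.2e-6)
    print(f"{label}: Cc = {cc*1e15:.1f} fF per 100 um of parallel run")
```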
One complexity of the new processes is that copper tends to diffuse into the silicon unless a barrier metal is sandwiched between them. To accommodate copper and the impact of shrinking geometries, the metal cross section is now a trapezoid rather than the previous rectangle. Layout parasitic extraction tools must now be able to handle such shapes efficiently and accurately.
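A common first-order fix treats the trapezoid's cross-sectional area as the average of the top and bottom widths times the thickness. The sketch below uses invented dimensions and an assumed effective resistivity for a narrow damascene line (which, per the width dependency noted above, runs higher than bulk copper).

```python
# Minimal sketch: resistance of a trapezoidal wire cross section,
# A = (w_top + w_bot) / 2 * t. All values are illustrative assumptions.
RHO_EFF = 2.2e-8   # assumed effective resistivity of a narrow line, ohm*m

def wire_resistance(length_m, w_top_m, w_bot_m, thickness_m, rho=RHO_EFF):
    area = 0.5 * (w_top_m + w_bot_m) * thickness_m
    return rho * length_m / area

r_rect = wire_resistance(100e-6, 0.20e-6, 0.20e-6, 0.35e-6)  # ideal rectangle
r_trap = wire_resistance(100e-6, 0.20e-6, 0.14e-6, 0.35e-6)  # sloped sidewalls

print(f"rectangle: {r_rect:.1f} ohm, trapezoid: {r_trap:.1f} ohm "
      f"(+{100*(r_trap/r_rect - 1):.0f}%)")
```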
Hot carrier injection
As IC processes continue to migrate to smaller geometries, problems such as hot-carrier injection and negative bias temperature instability surface, requiring design tool modifications or process enhancements. As geometries shrink, the electric field in the channel grows larger and more electrons become energized, or "hot." Some of these electrons damage the channel-oxide interface and degrade circuit performance over time, a phenomenon referred to as hot-carrier injection.
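Reliability engineers often estimate hot-carrier lifetime with the classic lucky-electron model, in which lifetime scales as (Isub/Id)^-m, with m around 3 for NMOS. The ratios in the sketch below are invented; the steepness of the relationship is the point.

```python
# Minimal sketch: relative hot-carrier lifetime from the lucky-electron
# model, tau ~ (Isub/Id)**-m. Generic textbook model, invented ratios.
def hci_lifetime_rel(isub_over_id, m=3.0, c=1.0):
    return c * isub_over_id**-m

base = hci_lifetime_rel(1e-3)   # substrate current at 0.1% of drain current
hot  = hci_lifetime_rel(2e-3)   # higher channel field doubles the ratio

print(f"lifetime relative to baseline: {hot/base:.3f}")   # 0.125, an 8x hit
```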
A high vertical electrical field at a high temperature, for gate-oxide thicknesses tox < 50 angstroms (5 nm), causes another reliability problem: NBTI. This is a very serious problem, especially with gate-insulator materials for IC process geometries at and below 0.13 micron. The result is an unacceptably high failure rate in chips during burn-in testing. NBTI appears only in PMOS devices, and there is no known process solution to it.
NBTI is expected to become an even greater concern as geometries continue to shrink, and it is a critical issue for designers pursuing maximum performance. Thus, designers need tools that identify the design areas where NBTI can cause trouble. It is likely that anyone designing at or below 0.13 micron will have to adopt a methodology that addresses these reliability problems, since performance and yield are usually the top two priorities for any state-of-the-art design.
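NBTI threshold-voltage drift is commonly fitted with a power law in stress time, dVth = A * t^n, with n roughly 0.16 to 0.25 from reaction-diffusion theory. The coefficient in the sketch below is an invented placeholder; the slow, relentless shape of the curve is the point.

```python
# Minimal sketch: NBTI threshold-voltage drift under the power-law
# model dVth = A * t**n. The coefficient A is an invented placeholder.
def nbti_dvth_mv(t_seconds, a_mv=1.0, n=0.25):
    return a_mv * t_seconds**n

for years in (1, 3, 10):
    t = years * 365 * 24 * 3600.0
    print(f"{years:>2} year(s): dVth ~ {nbti_dvth_mv(t):.0f} mV")
```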
IC fabricators are beginning to admit publicly that they may be experiencing one or more of these reliability problems. Not surprisingly, manufacturers are wary of discussing such weaknesses in their processes, but these new problems are so universal that many foundry sources now address them openly. Industry leaders are installing design-tool solutions to help their customers get the best possible IC designs out of their processes.
Hierarchical physical analysis
The rapid progress of very deep submicron technology has made large-scale SoC design possible. Such a design typically packs millions of devices on a chip, and today's EDA tools cannot easily handle designs that large in flat mode. Exploiting the hierarchy of the design definitely helps. However, on-chip coupling capacitances cross block boundaries and can destroy the design hierarchy. The EDA industry has to provide hierarchical physical-analysis tools that take coupling capacitance into account.
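One common workaround, sketched below, is to replace a coupling capacitance that crosses a block boundary with a grounded capacitance scaled by a Miller factor between 0 and 2, chosen by how the aggressor switches relative to the victim. The values are illustrative, and real tools refine the factor with timing-window information.

```python
# Minimal sketch: decoupling a cross-boundary coupling cap by grounding
# it with a Miller factor, so each block can be analyzed on its own.
def effective_ground_cap(c_couple_f, aggressor="opposite"):
    miller = {"same": 0.0, "quiet": 1.0, "opposite": 2.0}[aggressor]
    return miller * c_couple_f

c_xc = 8e-15   # assumed 8 fF coupling cap crossing the boundary
for case in ("same", "quiet", "opposite"):
    ceff = effective_ground_cap(c_xc, case)
    print(f"aggressor switching {case:>8}: Ceff = {ceff*1e15:.0f} fF")
```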
The traditional signal integrity noise analysis approach to design closure cannot succeed, as evidenced by the fact that today's largest designs are failing at tapeout at a rate of 50 percent or more. The technological problems surrounding the 0.13-micron process make this a very expensive statistic. Today's IC designers must be aware of these problems before starting a design, and must be assured that the newly emerging generation of tools developed to address these complexities is ready and able to work effectively at the most fundamental level of design.