Vlad Marchuk, CTO and founder of PolytEDA Software Corp., discusses the changes in design tools and techniques for physical design.
The semiconductor industry continues to shrink process technologies to geometries far below the wavelength of the light source used to create the patterns on a wafer. Starting at geometries such as 40nm, designers who never had to think about semiconductor manufacturing process-related issues were forced to grapple with complex physics, in addition to the geometric growth in data size during design implementation. That is a great deal of learning for designers to absorb in a short time, and the pressure will continue: 28nm technology is around the corner, and the 20nm process will soon follow.
As the semiconductor industry charges forward with its process technologies, the electronic design automation (EDA) industry is trying to keep in step with the process advances. While EDA tools have done a reasonably good job of keeping up, recent trends in process technologies have created new needs; for instance, the need to move from a compute farm to a compute “ranch” for physical verification. Even so, runtimes still take several hours, or even days. Most of these tools were architected around algorithms and concepts developed in the 1990s (some even in the 1980s) and cannot meet the runtime and scalability demands of advanced process technologies, today or tomorrow.
At 65nm, designs came close to 2 billion transistors. At 40nm, the number of transistors increased to several billion, which further challenged EDA tools, especially physical verification tools. Given that scalability is a critical requirement for a designer, the ability to comfortably handle tens of billions of transistors with a reasonable turnaround time is a must for physical verification tools.
Beyond sheer data volume, the increasing complexity of design rules at each new process node is a major concern, often leading to over- and under-checking during physical verification of a layout. For example, when verifying a layout at 28nm, or even 40nm, rule checks must always be performed in the context of the surroundings: a design rule can take different values for the same shape on the same layer, depending on which shapes (on the same or a different layer) lie nearby. These context-sensitive rules require context-sensitive checking. Such rules did not exist in the 1990s, when the tools, including their rule coding languages, were architected. Over the years, however, EDA vendors successfully retrofitted the tools with features on top of the original architecture that continued to meet design verification needs. Figure 1 illustrates the evolution of physical verification technology and tools.
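To make the idea concrete, here is a minimal sketch of a context-sensitive spacing check. The rule values and function names are hypothetical (real 28nm rule decks are far more elaborate), but the shape of the rule is typical: the required spacing between two wires on a metal layer depends on the width of the wider wire and on how long the two run in parallel, so the same physical gap can be legal in one context and a violation in another.

```python
def required_spacing(width_a, width_b, parallel_run):
    """Return the minimum spacing (nm) for two parallel metal wires.

    Assumed rule table (hypothetical values):
      - base spacing: 70 nm
      - "wide-metal" rule: if the wider wire exceeds 100 nm AND the
        parallel run exceeds 300 nm, spacing grows to 120 nm
    """
    wide = max(width_a, width_b)
    if wide > 100 and parallel_run > 300:
        return 120
    return 70

def check_spacing(width_a, width_b, parallel_run, actual_spacing):
    """True if the measured spacing satisfies the context-dependent rule."""
    return actual_spacing >= required_spacing(width_a, width_b, parallel_run)

# The same 80 nm gap passes or fails depending on its surroundings:
print(check_spacing(60, 60, 500, 80))    # narrow wires, 70 nm rule applies
print(check_spacing(200, 60, 500, 80))   # wide neighbor, 120 nm rule applies
```

A flat, context-free checker that applied a single spacing value would either flag the first case needlessly (over-checking) or miss the second (under-checking), which is exactly the failure mode described above.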
Figure 1: Evolution of physical verification tools
It is clear that current design rule check/layout versus schematic (DRC/LVS) tools are like huge multi-story buildings sitting on a foundation that needs a major overhaul just to serve the next generation of occupants. That is not easy for a building owner to execute without causing major disruption in service, and the same is true of the DRC/LVS tools in use today. The move from “flat” to “hierarchical” processing yielded major improvements in productivity until more advanced process nodes brought the need for multi-CPU/multi-host architectures. With today’s leading-edge process technologies, however, factors like context-sensitive proximity effects and metal fill play such a strong role that in most cases the individual instances of a hierarchical structure must be analyzed separately. This raises some important questions: except for highly structured layouts like memory, what is the value of hierarchical DRC processing for other layout styles? Are the current tools capable of handling the needs of process technologies at 28nm and beyond?
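The point about hierarchy can be sketched in a few lines. The model below is deliberately simplified (a hypothetical one-dimensional spacing rule with made-up coordinates): a cell that is clean when verified once in isolation can still violate a spacing rule in one of its placements, because the top-level geometry surrounding each instance differs. That per-instance context is what forces the checker to analyze instances separately, eroding the benefit of checking the cell once.

```python
MIN_SPACING = 70  # nm, hypothetical rule value

# Two placements of the same (internally clean) cell. For each instance:
# (position of the cell's edge, position of the nearest top-level neighbor).
instances = {
    "I1": (0, 150),  # neighbor 150 nm away
    "I2": (0, 40),   # neighbor only 40 nm away
}

def instance_is_clean(edge, neighbor):
    """Spacing check that depends on the instance's surroundings."""
    return abs(neighbor - edge) >= MIN_SPACING

results = {name: instance_is_clean(e, n) for name, (e, n) in instances.items()}
print(results)  # I1 passes, I2 violates the rule
```

A purely hierarchical checker that verified the cell once and reused the result for both placements would report both instances as clean, missing the violation at I2.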
About the author:
Vlad Marchuk, CTO and founder, PolytEDA Software Corp.
Marchuk has more than 20 years of experience in the EDA industry. He co-founded OTTO Software (acquired by CDN in 2003), which developed a physical verification system for integrated circuits, and he has held various positions at Cadence and Electronics Workbench. He graduated from Kiev Polytechnic University in 1988 with a Master of Science degree in CAD engineering.