Ever since computer-aided electronic design automation tools started to appear on the scene, companies have released newer versions with enhanced capabilities and proclaimed them to be “bigger,” “better,” “faster,” “more accurate,” and so forth. More recently, it has become common to describe a tool as being “the next generation”, often abbreviated to “TNG” in honor of the classic television series Star Trek: The Next Generation. (Created 21 years after the original Star Trek show, Star Trek TNG was set in the 24th century from the year 2364 through 2370 – about 100 years after the original series timeframe.)
But when you fight your way through all of the rhetoric, you come to realize that – in reality – thus far there have been only two main epochs in Electronic Design Automation (EDA). (In this context, the term epoch is understood to refer to a period of time that is characterized by radical changes and memorable developments.) Of particular interest is the fact that we are now at the dawn of the third epoch of EDA.
This paper starts by briefly recalling what life was like for electronic hardware design engineers before EDA appeared on the scene. Next, key aspects of Epoch #1 and Epoch #2 are considered. Finally, the new capabilities that will define Epoch #3, such as employing sign-off-level tools throughout the entire design and implementation flow, are introduced. As we will see, Magma Design Automation led the second epoch and is now blazing the trail into the third.

Life before EDA
Prior to the introduction of EDA, early designs were typically captured as gate/register-level schematics using pencil and paper. Functional verification was performed by gathering a group of engineers together and getting everyone to agree that the logic would indeed perform its desired purpose. Timing analysis was performed by adding the various gate and wire delays associated with each path by hand.
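To make this concrete, here is a minimal sketch (in Python, purely for illustration; the path, delay values, and clock period are all invented) of the kind of arithmetic an engineer would have performed by hand: the path delay is simply the sum of the gate and wire delays along the path, and the slack is the clock period minus that sum.

    # Illustrative only: invented gate and wire delays (in nanoseconds) for a
    # single register-to-register path, summed the way an engineer would have
    # done by hand with pencil and paper.
    gate_delays = {"FF1_clk_to_q": 1.2, "NAND2": 0.8, "XOR2": 1.1, "FF2_setup": 0.9}
    wire_delays = [0.3, 0.5, 0.4]  # delays of the wires connecting those gates

    path_delay = sum(gate_delays.values()) + sum(wire_delays)
    clock_period = 6.0             # assumed clock period in nanoseconds
    slack = clock_period - path_delay

    print(f"path delay = {path_delay:.1f} ns, slack = {slack:.1f} ns")
    # path delay = 5.2 ns, slack = 0.8 ns

Repeating this exercise for every path in even a modest design is exactly the kind of drudgery that EDA tools were created to eliminate.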
As you may imagine, this form of design was extremely resource-intensive, time-consuming, and prone to error…
The First Epoch (Standards and file exchange)
The 1970s and 1980s saw the development of computer-aided design, analysis, and verification tools, such as schematic capture, analog and digital simulators, timing analyzers, and so forth. These tools came to be referred to by the “umbrella” name of Electronic Design Automation (EDA).
Although EDA tools cover a wide range of tasks, all of the tools in Epoch #1 shared certain characteristics. First, each had its own internal database formats and algorithms for things like timing calculations. Second, they quickly came to be based on industry or de facto file formats and standards, such as DEF, LEF, GDSII, EDIF, Verilog, and VHDL (the very first tools used their own proprietary formats, but this meant they couldn’t easily communicate with each other, which prompted the adoption of standards). And third, the vast majority of inter-tool communication was performed by passing ASCII text files around.
In addition to being slow and cumbersome, this resulted in a very “sequential” style of design. For example, the schematic capture tool would be used to capture the design and output a netlist. This netlist would then be used as input by the next tool in line, such as a simulator or timing analyzer. Any problems that were detected would have to be addressed by hand – perhaps by opening up the schematic, making a change, generating a new netlist, and so forth. Because the tools were so isolated from one another, there were few (if any) productivity-enhancing features such as cross-probing or real-time inter-tool communication (the back-annotation of attributes into the schematic, for example).
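As a purely notional sketch (the tool and file names below are invented, not those of any real product), an Epoch #1 flow behaved much like a chain of separate programs whose only means of communication was the files they wrote to disk:

    # Notional Epoch #1 flow: each tool is a separate program with its own
    # internal database; the only communication channel is an ASCII file
    # written by one tool and re-parsed by the next. Tool names are invented.
    import subprocess

    subprocess.run(["schematic_capture", "-o", "design.net"], check=True)
    subprocess.run(["simulator", "design.net", "-log", "sim.log"], check=True)
    subprocess.run(["timing_analyzer", "design.net", "-report", "timing.rpt"], check=True)

    # Any problem flagged in timing.rpt sends the engineer back to the
    # schematic by hand, after which every file in the chain is regenerated.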
The end result was a “mishmash” of EDA tools that communicated with each other via a “spaghetti” of files and standards. Even multiple tools from the same EDA vendor were subject to these problems. Having said this, everything worked up to a point.

The Second Epoch (The unified data model)
The silicon structures associated with ASICs in the early 1990s were relatively large, and effects like signal delays were relatively easy to calculate. (For the purposes of these discussions, we will take the term ASIC to embrace ASSP and SoC devices.) As the processes used to create ASICs evolved and structures on the silicon became smaller and smaller, effects that previously were negligible (third or fourth order) started to become noticeable (second order) or critical (first order), and calculations for attributes such as delays, noise, and power consumption became increasingly complex.
Inaccurate estimations made by the front-end tools resulted in a lack of predictability between the logical design and the physical design. In turn, this resulted in noise, power, and timing problems and difficulties in convergence between the implementation engines and their sign-off counterparts. Around this time, delays associated with the on-chip interconnect began to dominate over delays associated with the logic cells, and physical modeling became an issue, which gave rise to physical synthesis technology.
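One way to see why interconnect came to dominate is a first-order, Elmore-style estimate of wire delay (the numbers below are invented round figures, not data for any real process; real extraction and delay calculation are far more involved):

    # First-order illustration of interconnect delay overtaking cell delay.
    # All values are invented round numbers, not data for any real process.
    r_per_mm = 1000.0    # wire resistance per millimeter, in ohms
    c_per_mm = 0.2e-12   # wire capacitance per millimeter, in farads
    length_mm = 2.0      # a 2 mm route across the die

    # Distributed-RC (Elmore-style) estimate: delay ~ 0.5 * R_total * C_total
    wire_delay = 0.5 * (r_per_mm * length_mm) * (c_per_mm * length_mm)
    gate_delay = 50e-12  # assume a 50 ps cell delay for comparison

    print(f"wire delay ~ {wire_delay * 1e12:.0f} ps vs gate delay ~ {gate_delay * 1e12:.0f} ps")
    # wire delay ~ 400 ps vs gate delay ~ 50 ps

Because the wire’s delay grows with the square of its length, no front-end estimate can be trusted until the physical route is known, which is precisely why physical synthesis emerged.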
But the biggest change, which marked the dawn of Epoch #2 in the late 1990s, occurred when Magma Design Automation appeared on the scene. Magma spent many years developing the concept of a Unified Data Model that could be accessed by all of the ASIC design, implementation, and verification engines. As part of this, Magma created new versions of these engines from the ground up, implementing them in such a way that they could take full advantage of the Unified Data Model.
The idea was to “wrap” all of the tools around the Unified Data Model. This not only made the flow faster by removing all data translations between the tools, but also provided a number of essential technical advantages. The memory footprint is reduced because, instead of making multiple copies of the data, the algorithms share the exact same data structures. This approach also allows multiple tools to run concurrently and to cooperate much more closely. Integrated circuit design is a delicate interplay between logic synthesis, placement, routing, extraction, and timing analysis. The Unified Data Model allows these tools to cooperate seamlessly, as opposed to traditional architectures in which the tools live in different worlds and run in different processes.
Moreover, physical optimization could be used in the logic design domain, and global routing could be run during placement. As implementation decisions were made by one engine, their effects could be analyzed on-the-fly and the results could be used to optimize or modify the original decision.
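As a very loose sketch of the idea (the classes and engine names below are hypothetical and greatly simplified; they are not Magma’s actual data model or APIs), the essential point is that every engine operates on the same in-memory design objects rather than on its own translated copy, so a change made by one engine is immediately visible to the analyses run by the others:

    # Hypothetical, greatly simplified sketch of a unified data model:
    # one in-memory design database shared by every engine, no file hand-offs.

    class DesignDatabase:
        """Single shared representation of cells, nets, placement, and timing."""
        def __init__(self):
            self.cells = {}   # cell name -> {"loc": (x, y), ...}
            self.nets = {}    # net name  -> list of connected cell pins
            self.timing = {}  # endpoint  -> slack, updated incrementally

    class PlacementEngine:
        def __init__(self, db):
            self.db = db
        def move_cell(self, name, loc):
            self.db.cells[name]["loc"] = loc   # edits the shared data in place

    class TimingEngine:
        def __init__(self, db):
            self.db = db
        def update(self):
            # Re-evaluates timing directly from the shared placement data:
            # no netlist export, no re-parsing, no second copy in memory.
            pass

    db = DesignDatabase()
    placer, timer = PlacementEngine(db), TimingEngine(db)
    # A placement decision can now be analyzed on the fly and revised at once.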
The Third Epoch (The age of embedded engines)

The Unified Data Model remains as relevant today as it has ever been. The fact that this data model is totally open allows advanced users – including engineers at Magma – to access and manipulate the data so as to support technology-specific and process-specific requirements. It is this level of accessibility and capability that has allowed Magma and its partners to extend the Unified Data Model to meet the requirements of designs targeted at 2.5D and 3D integrated circuits based on silicon interposer and Through-Silicon Via (TSV) technologies. As the industry continues to adopt next-generation process nodes, however, new considerations need to be addressed…
One of the legacies of Epoch #2 is that implementation engines such as Parasitic Extraction and Static Timing Analysis (STA) differ from their sign-off counterparts. There are a number of reasons for this, including historical considerations, but the bottom line is that this situation is becoming unacceptable at the latest technology nodes (28nm, 22nm, and 20nm; the soon-to-arrive 14nm node; and below).
If something isn’t done to address this problem, the lack of correlation between the implementation engines and the sign-off engines will lead to unacceptable delays in the design cycle – perhaps even to failures in design closure – or, in a “best-case” scenario, to the addition of unnecessary margins, resulting in lower yield, reduced performance, larger die sizes, and increased power consumption.
The solution is to employ sign-off-level tools throughout the entire design and implementation flow. However, the way in which tools are used in Epoch #2 level design environments typically involves invoking them on a case-by-case basis and passing data back and forth via an Application Programming Interface (API). Since implementation engines invoke timing analysis millions upon millions of times, and since each analysis run in the sign-off-level tools miscorrelates with that of the implementation engines, the end result is unacceptable overhead and delay.
The solution is to move to Epoch #3, which involves the use of advanced versions of sign-off-level engines that are fast enough to be deployed throughout the entire design flow and that are deeply embedded into the implementation environment for sign-off-accurate analysis during physical optimization.
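A rough way to picture the difference (again as hypothetical, simplified Python; these are not the actual product interfaces) is the cost of a single timing query. In an Epoch #2 style flow, every query pays the price of packaging the data, crossing a tool boundary, and mapping the results back, whereas an embedded sign-off-level engine answers directly from the shared, already-current data model:

    # Hypothetical contrast between invoking an external sign-off tool per
    # query and querying a sign-off-accurate engine embedded in-process.

    def epoch2_slack(external_tool, design, path):
        snapshot = design.export()                # package the data...
        report = external_tool.analyze(snapshot)  # ...cross the tool boundary...
        return report.slack_of(path)              # ...and map results back.
        # Tolerable once; ruinous when optimization asks millions of times.

    def epoch3_slack(embedded_engine, path):
        # The sign-off-level engine lives in-process on the shared data model,
        # so each query is an incremental, sign-off-accurate calculation.
        return embedded_engine.slack_of(path)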
Before we proceed, let’s first remind ourselves that Magma’s Talus is the physical design environment of choice for engineers creating complex ASICs, ASSPs, and SoCs at all process nodes where performance and power management are crucial. Magma’s Tekton is a state-of-the-art STA tool that provides groundbreaking multi-mode/multi-corner performance, offers full support for crosstalk analysis and AOCV, and performs timing calculations on tens of millions of instances in minutes. Magma’s FineSim is a high-performance, high-capacity SPICE engine. And Magma’s QCP (which is based on Magma’s industry-leading 3D field solver, QuickCap) is a cutting-edge parasitic extraction tool.
We are currently just at the beginning of Epoch #3, but already Tekton has been embedded into Talus, where it can be used throughout the design, analysis, and implementation process by the appropriate engines. Meanwhile, FineSim has been embedded into SiliconSmart to ensure the highest accuracy with regard to Standard Cell and I/O Cell characterization for multiple PVTs (processes, voltages, and temperatures). Most recently, FineSim has also been embedded into Tekton for use when extreme accuracy is required, such as performing analysis on the high-performance clock mesh.
It is exciting to be present at the dawn of a new epoch in EDA, and it will be interesting to observe developments in the coming years.

About the author
Behrooz Zahiri is vice president of marketing at Magma, responsible for corporate business strategy and market solutions. He returned to Magma in 2011 after three years working outside the EDA industry. Prior to that, Zahiri was vice president of product marketing for Magma’s flagship RTL-to-GDSII product lines. Before joining Magma, Zahiri served as Actel’s director of marketing for FPGA software products, EDA business development, application engineering, and customer support. He has also held various engineering positions at Intel Corporation. The author of numerous IC industry articles, Zahiri has 17 years of experience in computer and IC chip design. He holds a Master of Science degree in Electrical Engineering from Stanford University and a bachelor’s degree in Electrical Engineering and Computer Science from the University of California, Berkeley.
If you found this article to be of interest, visit EDA Designline, where you will find the latest and greatest design, technology, product, and news articles with regard to all aspects of Electronic Design Automation (EDA).