The rapid evolution of IC process technology over the past few decades has brought many changes to the VLSI design process. Skyrocketing chip complexity and collapsing time-to-product windows have driven design teams to become more efficient with their resources. At the same time, shrinking geometries and rising clock frequencies have brought to the forefront a host of new physical and electrical effects that place a premium on integrated analysis and the balancing of multiple design objectives.
Together, these trends are driving IC design teams to maximize their productivity by using the best tools available, whether they are from third-party EDA tool vendors or developed in-house. However, to address the stringent requirements of nanometer processes and still meet shrinking chip development windows, these tools must interact efficiently and seamlessly.
To help CAD development teams meet this challenge, the OpenAccess Coalition has launched an industry-wide effort to provide true open interoperability across IC design tools through an open, standard data access interface (API) and a supporting reference database implementation. This emerging industry standard has the potential to replace the expensive and inefficient process of using file translators to integrate tools from different vendors by employing a standard, in-memory representation that supports efficient data sharing. The OpenAccess initiative promises to dramatically improve IC design productivity by allowing tool integration teams to more easily bring together best-in-class tools and improve the interoperability of those tools.
Formally established in 2001 with the Silicon Integration Initiative (Si2) as its host, the OpenAccess Coalition first released its C++ API specification and the supporting database code in early 2003. CAD teams in a few companies, including HP, have begun to offer OpenAccess-based flows in their production design environment. This paper describes HP's decision to deploy the OpenAccess technology in its baseline Synthesis, Place and Route (SP&R) flow for a leading-edge 90nm chipset design. HP's preliminary experiences with this early deployment project are also described.
Historical HP approach
As a leading developer of high-performance ICs, HP has used an IC design methodology for over two decades that is built around an internally-developed database and toolset. However, in the 1980s, as commercial EDA vendors brought to market a new generation of improved tools, HP began replacing some of its own in-house technology with vendor-supplied applications.
Over the same period, rapid advances in IC process technology, and the opportunity these advances have given designers to craft very complex, high-speed ICs, have had a profound impact on the complexity of HP's design environment. Physical and electrical effects, which designers could at one time ignore or avoid through guardbanding, now have a major impact on chip design. To compensate for these effects, designers must perform more intense analysis and tune their designs accordingly, both earlier and more often throughout the design cycle.
Like many companies designing microprocessors and other high-performance ICs, HP has addressed these challenges by incorporating design tools from a growing variety of sources into its flow. Consequently, its design environment has grown increasingly complex. Frequently, tools from different vendors have supported their own distinct file formats for reading and writing data.
Compatibility between tools using different data formats became an increasingly critical obstacle, both to the development of a robust design flow and to attempts to reduce the number of costly design iterations. Industry tool developers have provided some relief for such incompatibilities by adopting industry-standard data file formats to improve the exchange of data between tools.
Still, a system built upon file-based transfers does not generally scale well. The time to parse and load the growing amount of input data is no longer negligible in the runtime of many tools. Furthermore, as data files have grown in size with each IC process generation, it has become increasingly difficult to detect incremental changes in the design from run-to-run or from step-to-step in a design flow.
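The incremental-change problem described above can be sketched with a hypothetical fingerprinting scheme (all names here are illustrative and not part of any HP or OpenAccess API): if each cell carries a content hash, comparing the hash tables from two runs isolates what actually changed, without re-parsing the entire design file.

```cpp
#include <cassert>
#include <cstdint>
#include <map>
#include <string>
#include <vector>

// Hypothetical sketch: per-cell fingerprints let a tool find what changed
// between runs without diffing entire multi-gigabyte design files.
// Compare the fingerprint tables from two runs and return the names of
// cells that were added, removed, or modified.
std::vector<std::string> changedCells(
    const std::map<std::string, std::uint64_t>& previous,
    const std::map<std::string, std::uint64_t>& current) {
    std::vector<std::string> changed;
    for (const auto& [name, hash] : current) {
        auto it = previous.find(name);
        if (it == previous.end() || it->second != hash)
            changed.push_back(name);  // new or modified cell
    }
    for (const auto& kv : previous) {
        if (current.find(kv.first) == current.end())
            changed.push_back(kv.first);  // deleted cell
    }
    return changed;
}
```

With such a table persisted between runs, a downstream tool only needs to reload the handful of cells whose hashes differ, rather than the whole design.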
Figure 1 HP's historical SP&R flow
A file-based transfer system also has a detrimental impact on the cost of building or enhancing a flow. Tool integrators must factor in the cost of integrating a tool into the CAD system, particularly when it doesn't support one of the industry-standard formats. In some cases, the cost of integration can exceed the cost of the tool by a factor of two or three.
Finally, subtle semantic differences between the way various tools interpret data can introduce design bugs into the flow that are difficult to detect and time-consuming to track down. These semantic differences ultimately lead to a less robust flow and expensive debug time. For large designs, even small issues of robustness can cause failures that invalidate many days worth of design time, which translates directly into longer design cycles.
For example, consider a DEF file written from a floorplanning tool and then read into a physical synthesis tool. If the physical synthesis tool does not handle the "Specialnets" section of the DEF description in the same manner as the floorplanning tool, this difference in semantic interpretation may then lead to shorted nets in the synthesis tool. Unfortunately, this error might not be discovered until the design is being routed, and before that discovery is made, considerable work on clock and scan insertion, fill cells, and power routing may have been completed.
All of this work is wasted if the router cannot complete its task because of the "Specialnets" error that occurred much earlier in the design flow. Since a complex flow invariably uses many files in different formats, engineers debugging such a problem must work their way back from the point where the failure in the design was detected in order to identify where it was actually introduced.
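A toy model makes the cost of such a semantic divergence concrete. Suppose one tool reads a special net's width field as the wire's full width while another reads it as the width on each side of the center line; the interpretations below are invented for illustration and are not the actual DEF semantics, but they show how the same file yields different geometry in each tool.

```cpp
#include <cassert>
#include <cmath>

// Hypothetical illustration of a semantic mismatch: a special-net stripe is
// described by a center line and a width value. Under a "full width" reading
// the wire extends width/2 on each side of center; under a "per side"
// reading it extends the full width value on each side (twice as wide).
struct Stripe {
    double center;  // center-line coordinate (microns)
    double width;   // the width value as written in the file
};

// True if the two stripes overlap (i.e., short) under the given reading
// of the width field.
bool stripesShort(const Stripe& a, const Stripe& b, bool widthIsPerSide) {
    double scale = widthIsPerSide ? 1.0 : 0.5;  // half-extent multiplier
    double gap = std::fabs(a.center - b.center);
    return gap < (a.width + b.width) * scale;
}
```

Two power stripes that are cleanly separated under the first reading short together under the second, and nothing flags the discrepancy until a later tool trips over it.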
HP database requirements for interoperability
Five years ago, HP began an investigation to rework its EDA environment. A primary consideration was the selection of a next-generation database technology that would support HP's tool integration and interoperability requirements. HP engineers considered their own proprietary EDA database, leading commercial databases, and an API standardization effort initiated by Sematech called CHDStd.
One key consideration was the potential for wide industry usage, which none of the choices offered. More than a standard API specification is necessary for industry adoption; a reference implementation that was both technologically excellent and available to the entire industry at low cost was essential. Access to the source code was critical to HP for effective tool and design flow debugging, for protecting its investment in database utilities, and for responding quickly to the unanticipated design issues that inevitably crop up in any high-end IC design. HP continues to believe that source code access is also important for encouraging the industry's vendors to adopt the standard.
Based on this demanding set of requirements, HP was very active in the organization and rollout of the OpenAccess Coalition. Before the first public release of OpenAccess technology, HP launched a Beta evaluation of the layout and geometric portion of OpenAccess.
Results from the evaluation proved very positive. HP found that OpenAccess offered performance comparable to the company's proprietary database and a significant improvement in capacity and extensibility. In addition, code quality was high and the API proved extremely consistent in style and was well documented, making it easy to learn and adopt. A follow-on evaluation tested the connectivity space and showed an OpenAccess capacity to scale better for large designs than HP's proprietary database.
Figure 2 Memory usage of HP and OpenAccess databases
Finally, in a test of OpenAccess' portability, the HP team found that they could readily build the source code in their software manufacturing system and target multiple OS/platform/compiler combinations, including Itanium. Throughout the evaluation and development process, the HP team found their counterparts in the OpenAccess Coalition and Cadence Design Systems to be highly responsive to recommendations provided by HP and other Coalition members. Oftentimes, those recommendations resulted in improvements to the technology.
HP's deployment of OpenAccess
Based on these evaluations, HP identified a project to deploy OpenAccess in a production design flow. The SP&R flow was selected as a project that would benefit significantly from OpenAccess. Previously, the greatest obstacle to improved productivity in the flow was the rising number of design iterations. Replacing file-based data exchange between tools with a common database allows tighter tool integration and should dramatically reduce the number of design iterations, improve design productivity, and accelerate time-to-product. Furthermore, the HP team felt that the layout automation space offered the most stability and maturity for an initial OpenAccess release.
Figure 3 HP's OpenAccess-based SP&R flow
The HP team chose to deploy OpenAccess on the design of a new, high-frequency 90nm IC with more than 10 million gates. Many of the commercial tools in HP's SP&R flow were already on OpenAccess, which eliminated many of the file-format data exchanges that the previous version of the flow had required. Furthermore, since SP&R represented a central flow in the digital IC design process, it offered an excellent foundation for evolutionary extensions, with additional functionality to be made available on subsequent chip designs.
Additionally, HP's CAD team felt that its own productivity could be improved by moving to a higher level of software development. Historically, HP's EDA engineers have retained a high level of visibility into and control of their system infrastructure by using their own in-house developed database, data models, and API. Over the years, that intimate knowledge of their system has proven to be valuable in numerous chip design projects. By having access to OpenAccess source code, the HP team believes that they can retain the same level of responsiveness to the chip design teams, while devoting more of their resources to the key tool development tasks facing next-generation processes.
Resolving implementation issues
One challenge HP faced as the project progressed was building a basic set of utilities on top of OpenAccess. While OpenAccess provides extensive low-level data access, it was valuable to develop a layered set of higher-level utilities for functions such as querying or modifying specific elements of the database. In addition, a Verilog reader and writer had to be developed to populate the database and to interface with tools not yet available on OpenAccess. Furthermore, HP's mature CAD system offered considerable design and data handling utilities for locating designs and libraries and for tracking dependencies; many of those utilities had to be ported to the OpenAccess environment.
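The layering idea can be sketched as follows. The types and function names below are illustrative stand-ins, not the OpenAccess API: the "low-level database" only exposes raw iteration over instances, and the higher-level helper packages a common query so application code does not repeat traversal boilerplate.

```cpp
#include <cassert>
#include <string>
#include <vector>

// Minimal stand-ins for low-level database objects (hypothetical, not the
// real OpenAccess classes).
struct Instance {
    std::string name;
    std::string masterCell;
};

struct Block {
    std::vector<Instance> instances;  // raw iteration is all the API offers
};

// Higher-level query utility layered on top: all instances placed from a
// given master cell, collected in one call.
std::vector<std::string> instancesOfMaster(const Block& block,
                                           const std::string& master) {
    std::vector<std::string> names;
    for (const auto& inst : block.instances)
        if (inst.masterCell == master)
            names.push_back(inst.name);
    return names;
}
```

A library of such helpers sits between applications and the raw API, so that flow scripts and in-house tools share one tested traversal implementation.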
Some minor issues came up as well. While the OpenAccess documentation proved excellent overall, the HP team did note that some method descriptions were simply restatements of the method names. In a few instances the documentation also lacked practical examples. Another small issue was the inability to view native OpenAccess database objects in a software debugger.
Integration with key applications remained a potential issue as well. Many of the tools not yet integrated with OpenAccess still used text-based interfaces. Also, some OpenAccess binary interfaces to EDA tools still lacked heavy industry usage. As an early adopter, HP had to work to verify those interfaces.
By mid-November 2003, HP completed the first stage of the OpenAccess deployment and released the new digital SP&R flow to the 90nm design team. While reducing the execution time is always important, HP's primary goal is to increase the success rate of each iteration through the SP&R flow.
An iteration is considered "successful" when it results in a design with high electrical quality and meaningful timing data. The resulting design may still contain some failing timing paths, but it should enable designers to attack the remaining design issues in RTL. HP's goal is to increase the iteration success rate from the 50 percent that was common in the previous SP&R system to 80 percent in the new, OpenAccess-based flow.
Over the long term, the HP team plans to use tight integration and intra-tool iterations to achieve design closure more efficiently. HP also hopes to cut the total number of required designer-driven iterations in half by adding applications to the flow, such as repeater optimization and clock distribution.
The success of the OpenAccess SP&R implementation has already led to some unanticipated consequences. As more designers use the flow, more complex designs are producing much larger blocks than in the past. That increased capacity in the SP&R flow, in turn, has begun to stress other parts of the design system where capacity has not yet been enhanced. This unexpected development has already forced HP engineers to plan changes in some back-end tools which traditionally dictate block size. It is also driving HP to deploy the OpenAccess implementation more rapidly to a broader segment of the design flow and to take better advantage of its high capacity.
The new OpenAccess-based design flow promises to deliver multiple benefits to HP chipset design groups. Productivity will rise as the design teams take advantage of tighter tool integration and the consistent interpretation of design data. In addition to a higher percentage of successful SP&R runs and a reduction in the number of design iterations, application and system support will also improve as debugging designs becomes simpler and more efficient. Ultimately, this new environment will allow design teams to take advantage of incremental design practices that will provide the most efficient way to concurrently meet the multiple design constraints in different domains inherent in the development of 90nm ICs.
HP's internal EDA development team is also benefiting significantly from the migration to an OpenAccess-based infrastructure. The ability to integrate third-party tools into the design environment more quickly, more tightly, and with less effort will offer HP's tool integrators the opportunity to maximize software reuse and reduce costs. While HP will still need to invest in infrastructure, tool developers in an OpenAccess-based environment can spend less time working on the lower-level database and its interface with the tools, and devote more of their efforts to higher-level utilities and algorithmic development.
Finally, the use of a standardized, OpenAccess-based environment changes the relationship between internal tool developers and third-party commercial EDA suppliers. For example, development teams inside HP can now more easily leverage the work of commercial EDA suppliers and academia and can more quickly use those advances as extensions of their own efforts. Furthermore, tool integrators at HP will look to commercial EDA suppliers for smaller, finer-grained engines as products that can be applied with greater flexibility to HP's design methodology than larger, more rigid applications.
Next steps for HP
Over the coming year, HP will enhance the SP&R flow and use OpenAccess to improve integration and achieve finer-grained control of the design process for the current chip design project. In the near future, the team will continue with its evolutionary adoption strategy and extend OpenAccess usage from layout automation to areas such as repeater insertion, parasitic extraction, signal integrity, electrical rule checking, and eventually a variety of other net-centric design and analysis capabilities. Looking ahead, HP anticipates integration with its timing flows, based on the work of the OpenAccess Timing Working Group.
Figure 4 HP's future OpenAccess-based SP&R flow
Additional feasibility studies will determine potential areas of expanded HP deployment of OpenAccess into production flows. Also important will be the availability of commercial tools in these areas. As the team moves more internal tools to OpenAccess, and as more third-party, partner, and university tools become available on OpenAccess, the resulting design system will support still more productive IC design.
One of the key elements contributing to the success of HP's first OpenAccess deployment project is the availability of the OpenAccess reference implementation and its source code. The OpenAccess database is more than just a repository of design information. The database and API were designed to be used directly by the algorithms of many applications.
The ability to directly query and modify the database that vendor tools in the SP&R flow are using allows HP's CAD team to tune the automated flow, as well as to better design and support proprietary applications that work in concert with commercial tools. On future design projects pushing the edge of performance, this capability will be essential to the HP tool development team's ability to refine the methodology of the flow and to provide quality support and fast response to the SP&R users.
OpenAccess also offers a rich set of extensibility features that allow HP's tool developers to extend the data represented by the OpenAccess model without changing the OpenAccess source code itself. Additionally, a complete set of callback functions that are triggered by a variety of actions on the database objects allows further extension of OpenAccess functionality without modifying the OpenAccess source code.
HP was able to tailor the behavior of hierarchical traversal to match existing HP methodologies by using these callback functions. Since HP can fine-tune the information in the database and its behavior without modifying its source, this HP-unique functionality remains in place when each new release of OpenAccess becomes available.
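The callback mechanism described above follows the classic observer pattern. The sketch below is a generic illustration of that pattern, not the actual OpenAccess observer classes: clients register functions that the database invokes when objects are created, so behavior can be extended without touching the database source.

```cpp
#include <cassert>
#include <functional>
#include <string>
#include <utility>
#include <vector>

// Hypothetical sketch of database callbacks (observer pattern). Registered
// functions fire on object creation, letting client code layer on custom
// behavior without modifying the database implementation itself.
class Database {
public:
    using Callback = std::function<void(const std::string& objectName)>;

    // Register an observer to be invoked whenever an object is created.
    void onCreate(Callback cb) { createCallbacks_.push_back(std::move(cb)); }

    void createObject(const std::string& name) {
        objects_.push_back(name);
        for (const auto& cb : createCallbacks_)
            cb(name);  // notify every registered observer
    }

private:
    std::vector<std::string> objects_;
    std::vector<Callback> createCallbacks_;
};
```

Because the customization lives entirely in the registered callbacks, it survives upgrades of the underlying database unchanged, which is exactly the property HP relied on across OpenAccess releases.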
As a high quality database implementation with available source code, OpenAccess provides a key benefit to tool developers across the industry. It allows them to more easily take advantage of the latest advances in university research. Most tools developed today at academic institutions are not written to a standard API. Integrating them into a production CAD system is typically difficult and time consuming, if not prohibitive.
If these same university tools were written to use OpenAccess, it could dramatically simplify the evaluation and integration process and make cutting-edge research much more accessible to the industry at large. It could also enable university projects to access leading-edge designs for testing their research ideas. This has consistently been a problem for academia, because leading-edge designs are not readily shared by the system design houses. With OpenAccess, however, university software that shares the same database API can be brought into the corporate world and tested directly on this proprietary data.
Eventually, an OpenAccess-based environment could dramatically change the way commercial EDA tool vendors design, package, and deliver their tools. In addition to supplying a point tool as a large packaged executable, tool vendors may alternatively supply a library with header files that the tool integrator can then access via a well-specified engine-API. The resulting tighter integration of these finer-grained components could significantly improve designer productivity and help design teams achieve higher quality results in fewer iterations.
Tight integration among best-in-class tools enabled by the standard OpenAccess API will support incremental design tuning necessary to address increasingly complex physical and electrical interactions. In turn, this increased incremental control will lead to earlier design closure and improved time-to-product.
CAD teams will be able to maximize their productivity in building tightly integrated tool suites by reducing the time and effort they devote to lower-level infrastructure development and support.
HP's OpenAccess-based SP&R project offers insight into opportunities provided by OpenAccess for in-house and third-party EDA tool developers. Simplifying the task of evaluating and integrating tools allows tool integration teams to more easily leverage best-in-class applications and university research. OpenAccess offers tool developers throughout the industry the chance to focus their efforts on developing the technologies that will shorten the design cycle and more efficiently generate higher quality ICs.
Alva Barney is a CAD expert in HP's Systems and VLSI Technology Division. He has been the lead developer of HP's Synthesis, Place and Route flows for both microprocessor and chipset design teams over the last six years, and is an active participant in OpenAccess Coalition working groups.
Scott Makinen is the EDA Manager for HP's Systems and VLSI Technology Division, responsible for delivering integrated CAD solutions for microprocessor and high-performance chipset design teams. Scott has nearly twenty years of industry experience spanning ASIC design, PC hardware and software development, linear tape devices and CD/DVD product markets.
Rick Ferreri is a CAD architect in HP's Systems and VLSI Technology Division, focused primarily on HP's CAD system and infrastructure architecture for the last six years. Prior to entering HP's VLSI design organization, Rick was a technical lead in HP's OS development organization where he worked for nine years.
Jim Wilmore is a CAD architect in HP's Systems and VLSI Technology Division, focused primarily on database and design management challenges. He has over 25 years of experience in the IC CAD industry working as an application developer, tool integrator and infrastructure architect. Over the past few years, Jim has been HP's primary representative to the industry's OpenAccess Coalition.