In the week between Christmas and New Year, a semiconductor equipment company purchased an EDA company, and an EDA company increased its ESL expertise with the acquisition of a British company. Both events have given pundits cause for judgments, predictions, and the usual doomsday scenarios. I thought the combined news exciting and full of promise. Let's take the events one at a time first and then look at their possible combined effects.
The acquisition of Brion Technologies Inc. by the Dutch company ASML Holding NV is indeed going to change the DFM market, but I believe that the consequences of the acquisition will not stop at a reshuffling of the existing DFM market as described in the news coverage.
When I combine this article with the contents of the third point in Gary Smith's top 10 EDA topics for 2007 (see the article) and add my own views of the DFM sector, some interesting developments become possible.
After the last DAC in San Francisco I remarked to Gary that I did not see any "D" in DFM, since all of the tools focused on post-GDSII fixes and none provided designers with a way to avoid implementations that would require Optical Proximity Correction (OPC) tools. To which Gary replied: "Oh, you are talking about real DFM!" In that same third point, Gary states that in his opinion a couple of recently announced products from Clear Shape Technologies are starting to allow engineers to keep OPC requirements in mind during design implementation.
A few years ago, when I was still writing for EDN, I predicted that the use of processes below 130 nm would require tight collaboration among designers, package providers, semiconductor foundries, equipment suppliers, and EDA companies. This acquisition simplifies and widens the communication channel between the maker of OPC tools and the maker of steppers, which both provides the requirements for and uses the results of those OPC tools. So this acquisition may not be a unique event, since mask preparation will require increasing knowledge of design features and requirements, as well as of optical technology and stepper capabilities, if we are to follow Moore's Law.
The opportunity handed to companies such as Mentor and Synopsys is twofold. On one hand, they can address real DFM and provide tools that help designers minimize the need for OPC modifications. On the other, they can either sell their existing OPC tools to ASML's competitors or spin off a company to address the post-GDSII problems, the market now labeled DFM/DFY, which truly has closer ties to manufacturing than to design. The two strategies are not mutually exclusive.
But I see an even more drastic change in methodology becoming possible following the merging of equipment companies and post-GDSII tools. The major obstacle to increased use of Very Deep Sub-Micron (VDSM) processes is financial, not technological.
Companies with the required market and financial capabilities are now producing and selling devices fabricated at 65 nm, and soon will do the same at 45 nm. But the number of such companies is small, so the semiconductor industry must find a way to increase the number of users in order to profitably use all of those 300 mm wafer fabs it has built.
What if, thanks to this wideband collaboration between OPC engineers and stepper engineers, we could develop a library of Locality Sensitive Cells (LSC)? Then OPC corrections would be implemented during the IC layout phase of the implementation; it is always better to avoid a problem than to correct it. When placing a cell in a layout, the tool would pick from a large inventory in the LSC library the variant that would generate the correct exposure, given not only its functional behavior but also the physical behavior required by the nature of the circuit and its immediate neighbors. The combination of true DFM and the use of an LSC library would decrease development time and increase yields, thus making a larger number of designs profitable on more advanced processes.
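To make the LSC idea concrete, here is a minimal sketch of what such a context-aware lookup could look like at placement time. Everything in it is an assumption for illustration: the 120 nm context threshold, the variant names, and the two-bucket context model are invented, not taken from any real library or tool.

```python
# Hypothetical sketch of a Locality Sensitive Cell (LSC) library lookup.
# The threshold, variant names, and context buckets are all illustrative.

def classify_context(neighbor_spacing_nm):
    """Bucket a cell's surroundings by how close its nearest neighbor sits."""
    return "dense" if neighbor_spacing_nm < 120 else "isolated"

# Each logical function maps to several pre-corrected layout variants,
# one per neighborhood context, so no post-GDSII OPC pass is needed.
LSC_LIBRARY = {
    ("NAND2", "dense"): "NAND2_precorrected_dense",
    ("NAND2", "isolated"): "NAND2_precorrected_iso",
    ("INV", "dense"): "INV_precorrected_dense",
    ("INV", "isolated"): "INV_precorrected_iso",
}

def pick_variant(function, neighbor_spacing_nm):
    """At placement time, select the variant whose geometry already
    compensates for the optical effects of its immediate neighbors."""
    return LSC_LIBRARY[(function, classify_context(neighbor_spacing_nm))]
```

In this toy model the correction knowledge lives in the library characterization step, done once per process by the OPC and stepper experts, instead of in a per-design post-GDSII pass.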
More RTL generation and better TLM modeling
With the acquisition of SpiraTech, Mentor gets a few notable assets. Some of the staff of SpiraTech are well known to members of Mentor's Design Verification and Test Division, since they worked together on VHDL 2002 and also explored together an object-oriented version of VHDL that was never developed. Together they form a strong team of "language lawyers" who can contribute to establishing a design exploration environment worthy of the ESL label. But that is not all.
With the introduction of Catapult synthesis a couple of years ago, Mentor Graphics set itself apart from the traditional ESL companies that tied their success to SystemC. Catapult generates an RTL netlist from C, albeit from a subset of C, though one larger than the SystemC synthesizable subset. Catapult has been well received in both Japan and Europe. SpiraTech, meanwhile, has developed its own proprietary language, CY, which has all of the characteristics of SystemC but adds the very important concept of interface semantics and the vitally important concept of a temporal domain. The absence of these has been among the biggest hurdles keeping SystemC from growing beyond its relatively small number of adopters. I expect it will not take long before CY and Catapult come together to significantly improve the power of RTL netlist generation from C and at the same time increase the utility of C as a transaction-level modeling language.
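To illustrate what "interface semantics plus a temporal domain" can add to plain C-style transaction-level modeling, here is a small sketch, in Python rather than CY or SystemC, and with an invented bus protocol: an untimed function call is carried through an interface whose phases have explicit durations.

```python
# Illustrative sketch only: the Phase/TimedInterface names and the
# three-phase bus write protocol are invented for this example.
from dataclasses import dataclass

@dataclass(frozen=True)
class Phase:
    name: str
    cycles: int   # duration of this protocol phase, in clock cycles

# A hypothetical bus write described as an interface: named phases with
# durations, i.e. both the protocol semantics and its temporal behavior.
BUS_WRITE = (Phase("address", 1), Phase("data", 2), Phase("ack", 1))

class TimedInterface:
    def __init__(self, protocol):
        self.protocol = protocol
        self.now = 0      # current simulation time, in cycles
        self.trace = []   # (start_cycle, phase_name, payload) log

    def transact(self, payload):
        """Carry one untimed transaction through the timed protocol phases."""
        for phase in self.protocol:
            self.trace.append((self.now, phase.name, payload))
            self.now += phase.cycles
        return self.now   # cycle at which the transaction completes
```

The point of the sketch is the separation of concerns: the caller stays at the untimed functional level, while the interface object owns both the protocol phases and the clock, which is roughly the gap the article says plain C and SystemC leave open.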
Although C is not the ideal ESL language, it is very useful for prototyping and feasibility exploration. It has been limited by a lack of complete integration with the implementation world. The mixture of capable language architects, Catapult, and CY may just be the cauldron required to complete the VHDL/SystemVerilog environment for system exploration and implementation.
Off to a good 2007 start
The two news items give me hope that the EDA industry will stop focusing only on implementation and verification and start truly working toward a correct-by-design methodology. It costs a lot to "verify", meaning "debug", designs, because engineers continue to be asked to develop more complex designs in shorter time with the same methodology used a few Moore's Law nodes back. When I think of the possibilities, I almost want to get back to engineering.