One question that everybody always asks is what the theme of DAC was. Most years I don't think it really has one, but this year the theme was obvious, at least at the level of the keynotes, panels, and high-level messaging. That theme is power, or perhaps more precisely, the need to be concerned about power. The whole conference started off with the Accellera breakfast, which concentrated on this subject; more about that in a minute. In his vision talk, Wally Rhines said that power must be reduced by two orders of magnitude. Following that, in the keynote, Gregg Lowe of Freescale said that the energy per bit had to be reduced a million times. Bob Colwell of DARPA perhaps had the direst message when it comes to power. He said in his SKY talk that power will bring down Moore's Law, and he is warning the government about the implications of this, both at the economic level and at the national security level. Many other talks touched on the importance of power.
So why is this suddenly such a big issue? It is not a new problem; it has been affecting process nodes since about 90nm. The reason is that most designs are now at 90nm and below, so it is becoming a mainstream problem. That makes economic sense to the EDA industry and is an indication of where it expects to make some big money in the coming years.
So, back to the Accellera breakfast. The panel was composed of (in order from left to right in the photo) Jeffrey Lee – Synopsys; Qi Wang – Cadence; Sushma Honnavara-Prasad – Broadcom; John Biggs – ARM and chair of the IEEE 1801 Working Group; and Erich Marschner – Mentor Graphics. It was moderated by Ed Sperling of System-Level Design.
Sushma noted that power is a system design problem, not just an implementation problem, and this puts a lot of stress on the team. Power is intrusive and affects every aspect of the flow, including software. We are seeing 20-30, and maybe even hundreds of, power domains in a large chip, and the control between them is getting complex. This requires a robust verification methodology. Jeffrey agreed, saying that small companies have to get the methodology right so that they can do this without a large CAD team in place.
It was noted that while power is the focus for most people, power also creates thermal issues, and these have to be dealt with as well.
Erich pointed out that leaving verification until the end is too slow, and you can only see small issues at that point. Today verification is done at RTL, but tomorrow it has to move earlier in the flow. We are beginning to see companies transition from treating power as a back-end part of the process to developing power intent at the RTL level and feeding that into the process.
Software people have to be informed about power. They cannot do this after the hardware has been finished; they have to get involved early on. This requires HW/SW co-verification, and often it requires careful definition of use-cases. The existing standards do not really deal with this. It is not just about leakage; it is about dynamic power and how to manage the cores.
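For readers unfamiliar with how this power intent is actually expressed, IEEE 1801 (UPF) captures it as Tcl-style constraints layered on top of the RTL rather than as changes to the RTL itself. A minimal, illustrative sketch is below; the block and signal names (cpu_core, cpu_pwr_en, VDD_AON, and so on) are hypothetical, and a real design would need far more detail:

```tcl
# Define a switchable power domain containing a hypothetical CPU block
create_power_domain PD_CPU -elements {cpu_core}

# Declare the supply nets feeding that domain
create_supply_port VDD_CPU
create_supply_net  VDD_CPU -domain PD_CPU
create_supply_net  VSS     -domain PD_CPU
set_domain_supply_net PD_CPU \
    -primary_power_net VDD_CPU -primary_ground_net VSS

# Describe the intended power-down behavior, not its implementation:
# a switch controlled by a (hypothetical) cpu_pwr_en signal
create_power_switch SW_CPU -domain PD_CPU \
    -input_supply_port  {in  VDD_CPU} \
    -output_supply_port {out VDD_CPU_SW} \
    -control_port       {ctrl cpu_pwr_en} \
    -on_state           {on_state in {cpu_pwr_en}}

# State that registers in the domain must retain their values when
# powered down, using an always-on retention supply
set_retention CPU_RET -domain PD_CPU \
    -retention_power_net VDD_AON -retention_ground_net VSS
```

The point this illustrates is the one the panel kept returning to: the same intent file can drive verification at RTL and guide the back end, without prescribing how the switches or retention registers are built.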
Power estimation is one of the areas that will be tackled in future versions of the UPF standard. This requires accurate power numbers for the components. How do you make power intent physically aware? The whole point of UPF is to capture data that will help the back end. It is about behaviors rather than the way in which something will be implemented. UPF is a constraint, not an implementation.

Brian Bailey
– keeping you covered
If you found this article to be of interest, visit EDA Designline, where you will find the latest design, technology, product, and news articles on all aspects of Electronic Design Automation (EDA).