Discussion panels held in public are often dull exercises in easy harmony. But get experts in a closed room and opinions can become more divided, or at least more vocal.
At conference panels, you often get everyone politely agreeing with one another and little real discussion of the deeper issues. But sit the same people down in a room together without an audience, and those same discussions come alive.
That was the case when I got together with Gary Smith, chief analyst at Gary Smith EDA; Paul Martin, senior manager responsible for debug, trace, and performance modeling at ARM; Rajeev Ranjan, chief technology officer at Jasper Design Automation; Harry Foster, chief verification scientist at Mentor Graphics; and Varesh Paruthi, senior technical staff member at IBM.
Brian Bailey: What do you see as the current state of verification and its future?
Gary Smith: If you look at the effort currently going into verification, it's flat. The tools we have today have kept up with Moore's Law, but if you look into the future you're scared witless...
Varesh Paruthi: We saw this great growth in the number of verification engineers, and the ratio really jumped up. I know your take is that we are still seeing growth, but there seems to be a flattening as well.
Harry Foster: There is flattening. It's getting pretty close to a one-to-one ratio in terms of verification engineers to design engineers. But who's to say? Maybe we had the wrong mix before; maybe it was wrong all along.
Gary Smith: The way we were designing chips then, we weren't wrong. The way we are designing chips now has more to do with verification for the final SoC -- that's where the problem has been. And that handoff has to be as clean as possible. Verification in the past was when the chief engineer blew the whistle and said we're done, and that was it. You taped out and you got what you got, and you lived with it. But we can't do that anymore.
Rajeev Ranjan: And who's to say that the current state is wrong? As you said, 10 years ago we were really challenged. Ten years before that we were thinking "this turn [of silicon] is not going to work." If we look today, things are working. Newer and newer parts are coming on the market. I think verification efficiency is definitely on the rise and that we are becoming more productive. Is there more scope for improvement? Certainly.
Varesh Paruthi: It's gone from an ad hoc discipline to a well-defined discipline with a clear mix of technologies, which are very mature at this point. Certainly, looking to the future, there is enough innovation to be done to scale up and enough challenges to be conquered, but history says we'll rise to the challenge.
Brian Bailey: What scares us at this point? Have we got it solved for the next 10 years? What are the things that are happening now that scare you?
Gary Smith: What scares me the most is what we are handing off to the assembly group. We've got IP integration down fairly well -- it sort of works -- and the RTL flow is robust at creating the IP blocks and handing them off to the assembly group. That's what we've been working on for a long time. Now the assembly group is looking at this design and wondering: how do you do constrained random testing on a 400 million gate design?
All: You can't. You don't. You just don't do it.
Varesh Paruthi: That's where the challenge is. The number of cycles that you can throw at the problems of tomorrow, such as end-to-end SoC verification issues -- that's the gating factor. What's next is to scale that checking across hardware and software.
Harry Foster: Assembly [of IP blocks] in the past was relatively simple, but now there's a lot more interaction between the parts, particularly when you throw power on top of it, with abstract state machines spanning multiple domains. The amount of verification at that integration level has skyrocketed, and on top of that there's the software aspect.
Brian Bailey: At what point do we have to include software?
Harry Foster: Going back to power, a lot of the power controllers are actually entirely in software, or you have a hypervisor type of power controller. You have to verify it. You have no choice. It's either emulation or FPGA prototyping or some way to have the software there.
Gary Smith: And the embedded guys aren't ready for that. That's why we have that big problem with the middleware guys. We don't have enough coming out of the universities for the middleware job -- a job that is now even more complex because it has to deal with power.
To be continued...