Is IC design productivity rising or falling? It’s a question on the minds of semiconductor executives and R&D managers throughout the industry. The answer depends on whether we view it in absolute versus relative terms. Both have merit. In absolute terms it’s rising, but in relative terms it’s falling.
A “relative” measurement compares changes in productivity to changes in design complexity: How much is productivity increasing compared to the increase in design complexity? Through that lens, productivity is falling, and recently the decline has become steeper. How do I know? Aside from rigorously measuring it for more than 10 years, I know that design team sizes have been steadily increasing – the facts and data irrefutably confirm it. That means productivity isn’t keeping pace with rising design complexity. The “escape hatch” solution has been to increase design team size – throw more engineers at the problem. Conversely, if productivity were keeping pace or increasing, average team size would be flat or declining, respectively.
Of course, in absolute terms, productivity is increasing: Productivity this year is higher than last year, and last year it was higher than the previous year, and so on. That’s also irrefutable – again, based on the facts and data. Consider the effort required to design a million-transistor SoC ten years ago versus what it takes today. No comparison – teams expend much less effort today than they did then.
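The distinction can be made concrete with some arithmetic. The numbers below are entirely hypothetical, chosen only to illustrate how absolute productivity (transistors per engineer-year) can rise steadily while team sizes still grow, because complexity grows faster:

```python
# Hypothetical figures: absolute productivity rises 20x over the decade,
# but complexity rises 100x, so the required team size still grows.
years = {
    # year: (design_complexity_in_transistors, transistors_per_engineer_year)
    2000: (1_000_000, 200_000),
    2005: (10_000_000, 800_000),
    2010: (100_000_000, 4_000_000),
}

for year, (complexity, productivity) in sorted(years.items()):
    team_size = complexity / productivity  # engineer-years needed
    print(f"{year}: productivity={productivity:,}/yr, team size={team_size:g}")
```

Absolute productivity climbs every period, yet the team needed grows from 5 to 25 engineer-years: that gap is the "relative" decline.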
Absolute year-over-year productivity improvement (or decline) is critically important to an R&D organization’s productivity improvement initiative, but it is not important from an industry-level standpoint. Rather, the relevant concern there is whether productivity is keeping pace with the combination of three inextricably intertwined forces: increasing design complexity, time-to-market pressure and global competition.
Declining “relative-productivity” is occurring even in the face of more design reuse, better EDA tools, new methodologies, etc. No doubt these things are boosting “absolute productivity,” but they aren’t enough to keep engineering managers and executives from continuously boosting team size. What will be the impact on the semiconductor industry?
Ronald Collett is president and CEO of Numetrics, which provides fact-based project planning and benchmarking software that improves IC development productivity and schedule predictability.
Ron, the key is the metric here...sure it takes more people to design an IC today, 100+ designers is not unheard of...but each designs lots of transistors or gates, much more than in the past...so in terms of the first metric productivity drops, in terms of the second it increases...Kris
Yes, exactly -- relative productivity continues to decline, whereas absolute productivity continues to rise. It's an inconvenient truth that executive management needs to face. Thanks for your comment. Ron
I would argue the main reason for low productivity is the lack of innovation in the EDA industry. IC designers have to deal with many point tools from different vendors, and some of them don't even work well with others. If you design an analog circuit in one process node, you have to completely redesign the circuit & layout to make it work in another process node. Basically, I feel the EDA tools should be able to mask the physical design challenges that come up at lower process nodes from the circuit/logic designers to improve productivity.
I agree that there need to be better tools for mixed-signal design. The EDA vendors have done a good job on the tools for pure digital design, since that is the easier case. The design community has to work with the EDA companies to get tools for analog and mixed-signal. The problem is that the standards need to be put in place first so that the tools can benefit everyone. Most companies see their design flow as a strategic advantage, so they don't have the desire for standardization.
I agree completely about the difficulty of getting standards in place. Tough to make happen when consensus is needed -- kind of like trying to get agreement at the United Nations. Lots of posturing, spin and lip service, which of course is not at all surprising. Much easier when a de facto standard arises, typically driven by a company with significant market power that has introduced a technology that demonstrates clear value-add. Thanks for your comment. Ron
You may very well be right, but I don't see more innovation than we've already seen coming out of the EDA industry. The market isn't growing much and the competition in the EDA industry is stiff, which means that the risk-return equation for venture capital funding is out of balance. The consequence is less investment in start-ups. So the innovation must come from larger EDA companies, which is always spotty. Thanks for your comment. Ron
When I refer to design productivity, the term "design" includes verification, as well as design (digital, analog, RF, etc.), layout, test, validation, qualification, etc. So my use of the term "design productivity" is a misnomer. (My bad.) It's really IC Development Productivity, which is the term our tools use to report the productivity metric. Development Productivity measures the aggregate productivity of an entire IC development project, from start-of-concept milestone to release-to-production milestone, and includes every activity occurring during that interval (except SW development, which has its own productivity metric -- SW Development Productivity).
But to your point, it is indeed true that verification productivity is not keeping pace with verification complexity. The evidence is in the size of verification teams -- they continue to grow, which means that verif. productivity is not keeping pace with verification complexity. In addition, verif. team size is growing non-linearly relative to other design activities on the project, which reflects that it is indeed becoming a more significant "bottleneck." Thanks for your comment. Ron
Much of the stagnation can be attributed to the toolsets. Yes, there have been major strides in tool development, but mostly in the area of broad integration and backend support, and not at the level of helping to understand the code itself. (Don't get me wrong -- some of the backend pieces are absolutely amazing!) At the front end, where the coder sits, there has been little effort at improving code entry. At both places where I worked with VHDL/Verilog, the code was edited by standard text editors. In my position as reviewer, I find myself sorting through multiple vi windows tracking this or that variable. I don't see why syntax-aware and otherwise helpful editors (like from www.scitools.com), or even compile-on-the-fly (like Visual Studio), are not more welcome in the hardware domain.
Coming from a software background (mostly C and C#), my impression of VHDL/Verilog is a bunch of ill-organized and poorly factored modules, and coding style/methods based as much on tradition and superstition as the worst software I've ever seen. Modules with 50 inputs are routinely accepted, and no one thinks this is hindering productivity? We won't even get to ideas like information hiding and other advanced concepts.
The languages require a schizophrenic combination of brute force and "oh, the compiler will optimize that away" thinking. Those coders that use the advanced features of the languages are clearly in the minority. Does it strike anyone else as odd that the simulator and synthesizer use different compiler _front_ ends? Syntax that works in one throws warnings and errors in the other! Duh!
Perhaps my impressions are wrong: my sample set is exactly '2' employers. But I see much the same on the comment boards.
I tend to disagree. Verilog/VHDL, though modelled on traditional software languages like C, were meant for hardware description, and there is only so much abstraction any language can support before it stops serving its actual purpose. I think even software engineers would agree with this!
VHDL was modeled from Ada, and Verilog seems to be some combination of C and Cobol. I did not say that these languages are at the end of their usefulness (although some tweaks would be helpful), but proper tools would enhance the productivity of using them.
Why are there no compilers that work in real-time, as the code is entered? For modern C, C#, Java and other systems, all syntax is checked and errors displayed directly in the editor. Cross references can be checked, so that an "input" to a module is instantly tracked to its source in the calling module. This greatly reduces the number of fundamental errors, and can ensure consistent naming.
Another example would be that the editor should recognize that a (Verilog) "wire" declaration is needed that corresponds to an "assign", and simply insert it, or a proper "reg" declaration when the assignment happens inside an "always". ....and delete them when necessary as well.
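As a sketch of what such an editor helper could do, here is a toy Python pass. The function name and the regex-based "parsing" are my own inventions for illustration only; a production version would need a real Verilog parser to handle bit widths, ports, generate blocks, and comments:

```python
import re

def add_missing_wires(verilog_src):
    """Insert a 'wire' declaration for each net that is the target of a
    continuous 'assign' but never declared. Deliberately naive: ignores
    bit widths, comments, and multi-module files."""
    declared = set(re.findall(r'\b(?:wire|reg|input|output|inout)\s+(\w+)',
                              verilog_src))
    assigned = re.findall(r'\bassign\s+(\w+)\s*=', verilog_src)
    # dict.fromkeys de-duplicates while preserving order
    missing = [net for net in dict.fromkeys(assigned) if net not in declared]
    decls = ''.join(f'  wire {net};\n' for net in missing)
    # Splice the declarations in right after the module header's ';'
    header_end = verilog_src.index(';') + 1
    return verilog_src[:header_end] + '\n' + decls + verilog_src[header_end:]
```

Run on a module whose `assign` targets include an undeclared net, it splices a `wire` declaration in after the module header; the reverse (deleting stale declarations) would follow the same pattern.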
Multiple files in a design should be displayed in a tree or network format. "dir" or "ls" just don't lend themselves to identifying dominant and subordinate files in a hierarchical design, and especially won't help in culling out just the call strings and linkage information. This is currently a form of tribal knowledge in the design team.
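To sketch the kind of hierarchy view being asked for, here is a toy Python pass (my own invention, regex-based and deliberately naive; a real tool would use a proper parser) that maps each module to the modules it instantiates:

```python
import re

def instantiation_tree(sources):
    """Map each Verilog module name to the list of known modules it
    instantiates. 'sources' maps filename -> file text. Regex-based,
    so it will miss macros and some constructs."""
    bodies = {}
    for text in sources.values():
        for name, body in re.findall(r'\bmodule\s+(\w+)(.*?)\bendmodule',
                                     text, re.S):
            bodies[name] = body
    known = set(bodies)
    # "modname instname (" is the instantiation shape; keep only names
    # that are modules we actually found, to skip false matches.
    return {name: [m for m in re.findall(r'\b(\w+)\s+\w+\s*\(', body)
                   if m in known]
            for name, body in bodies.items()}
```

Printing that map with indentation gives exactly the dominant/subordinate view that `ls` can't: top-level modules are the ones that appear as keys but never as anyone's child.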
The fundamental problem here is that software developers make their own tools. Hardware developers are stuck with vi or gvim or (Heaven help us) edit, and don't know what they are missing. You have a billion-instruction per second machine there (thanks to the hardware guys that came before us), and it spends its day waiting for keystrokes (sort of reminiscent of Marvin the robot). Incremental simulation and synthesis should be a research topic at all the major vendors.
@antiquus -- There are more tools available to help out at the RTL level than just text editors feeding into simulators & logic synthesis tools. There are decent linting tools, for example, that will warn you of all sorts of bad coding practices, and will also let you graphically view your hierarchy and track objects from one level to another.
I like your idea of an editor that would automatically add the mandatory (and mundane) stuff like wire & reg declarations. But you should be aware that even for old standby editors like vim, there are add-ons for Verilog syntax awareness -- nothing too fancy, but at least they display keywords in one color, numbers in another color, operators in yet another and so on.
Not nearly up to the level of software development tools, but not entirely stuck in the Stone Age either.
And for real productivity gains in hardware design, certain types of designs lend themselves nicely to ESL tools -- even some tools in which the design entry method is drawing block diagrams rather than typing code in a text editor.
A certain woman is a diabetic. She is now skilled at working with syringes, and finding new places to give herself those shots. She is very grateful for the science that keeps her alive, and is diligent to follow the rules.
Then science (an art) became technology (an engineering discipline), and she received an implanted insulin pump.
Now, at restaurants, she pulls out her wireless (yes, that's W I R E L E S S) remote pump controller, and uses it to prick her finger. She taps in chicken, potatoes, and ice cream, and the calculations are done. No more charts, no more disposing of needles, no more remembering to just do it.
And you have colored syntax and lint!!?!!
@RonCollett : we found your lost productivity!! It's spinning counterclockwise in the northern hemisphere!
Don't forget that VHDL and Verilog are languages that are used to describe physical logic. Many of the techniques used by logic designers help prevent problems in the physical logic. Good software coding can cause problems when it is synthesized into logic. You may need a block in your chip that has 50 inputs, so that is okay.
It does not strike me as odd that the synthesizer and the simulator have different compilers, since they are used for two different objectives. Many things work fine in simulation that cannot be easily implemented in silicon. That is the designer's job.
So you are saying that your early design efforts can run down a rabbit hole, leaving the synthesis step to materially start over? Reliance on the designer's wisdom in this matter does not allow for scalable productivity. How many iterations do young designers require before they learn these quirks? Getting the bugs out during synthesis and validation when they should have been gone during simulation and verification is certainly a hindrance to better productivity.
ahh, you can't even buy ICs any more.
We almost ALWAYS have to make changes because even 'standard' components are not available.
What a joke.
all these new fantastic 'press releases' for some pie-in-the-sky chip, but you can't hardly find ANY inventory any more.
WHAT A JOKE.
Time to redesign using basic discrete components.
take your 'fancy' chip sets and go fly a kite !!!
Join our online Radio Show on Friday 11th July starting at 2:00pm Eastern, when EETimes editor of all things fun and interesting, Max Maxfield, and embedded systems expert, Jack Ganssle, will debate just what is, and is not, an embedded system.