When planning new IC design projects, such as SoCs or complex analog and RF chips, R&D organizations that have a firm grasp of the complexity of implementing the design hold a powerful competitive advantage. Complexity is a measure of engineering difficulty and provides the foundation for reliably estimating a project's engineering resource requirements and development cycle time, which is the essence of good project planning. Can anyone disagree that consistently reliable project plans, meaning projects that finish on time and within budget, translate into higher revenue and profits? But how does one get an accurate, quantitative calculation of design complexity?
One approach (that I know works) uses "industry standard effort" as the proxy for complexity. Industry standard effort is a calculation of the effort the average design team in the semiconductor industry would expend on the particular design project, based on the design's technical specifications and characteristics. The calculation must be performed by an empirically calibrated model, not a theoretical one. Technical characteristics include parameters such as process technology and geometry, number of metal layers, power, circuit types such as AMS, RF, logic and memory, transistor counts, processor cores and block functions, clocking schemes and frequencies, amount of reuse, and many others. A number of top semiconductor companies use this methodology. Intel, for example, recently presented a paper entitled "Empirical Model for Organizational Dynamics" that references this approach.
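To make the idea concrete, here is a rough sketch of what such a parametric calculation might look like. The parameter names, coefficients, and scaling factors are purely illustrative assumptions, not an actual calibrated model, which would be fitted to a large database of completed industry projects.

```python
# Hypothetical sketch of an "industry standard effort" calculation.
# All coefficients below are made-up placeholders for illustration.

def standard_effort(gate_count_m, analog_blocks, process_nm, reuse_fraction):
    """Return estimated effort in person-weeks for an 'average' design team."""
    base = 40.0 * gate_count_m ** 0.8               # digital logic scales sub-linearly
    analog = 120.0 * analog_blocks                  # AMS/RF blocks carry a fixed premium
    node = 1.0 + (65.0 / process_nm - 1.0) * 0.5    # finer geometries add effort
    reuse = 1.0 - 0.6 * reuse_fraction              # reused IP discounts new-design effort
    return (base + analog) * node * reuse

# Example: a 20M-gate SoC with 3 analog blocks at 28 nm and 40% reuse
print(round(standard_effort(20, 3, 28, 0.4)), "person-weeks")
```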
Developing a reliable complexity estimation model demands extensive mining of project data to uncover the relationships between chip design parameters and the engineering effort required to implement them. I know this firsthand, having spent more than ten years working in the field. The linchpin is a rich database of industry projects and a mathematical model that can be readily calibrated with the industry data.
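A minimal sketch of what that calibration step could look like appears below. It assumes a small table of completed projects with their design parameters and actual effort; the figures and the log-linear least-squares form are placeholders, not real benchmark data or a production model.

```python
# Illustrative calibration of a complexity model against historical projects.
# The project data below is invented solely to show the mechanics.

import numpy as np

# columns: gate count (M), analog blocks, reuse fraction; target: effort (person-weeks)
X = np.array([[10, 1, 0.2], [25, 4, 0.5], [5, 0, 0.1], [40, 2, 0.6]], dtype=float)
y = np.array([520, 1100, 230, 1400], dtype=float)

# fit log(effort) ~ w0 + w1*log(gates) + w2*analog_blocks + w3*reuse
A = np.column_stack([np.ones(len(X)), np.log(X[:, 0]), X[:, 1], X[:, 2]])
w, *_ = np.linalg.lstsq(A, np.log(y), rcond=None)

def predicted_effort(gates_m, analog_blocks, reuse):
    """Standard effort predicted by the calibrated (toy) model."""
    return float(np.exp(w @ np.array([1.0, np.log(gates_m), analog_blocks, reuse])))

print(round(predicted_effort(20, 3, 0.4)), "person-weeks")  # estimate for a new project
```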
With a reliable calculation of standard effort, R&D organizations can compare the relative complexities of different designs, a very useful project planning capability. Taking it one step further, R&D managers can use the calculated complexity to determine the productivity of a finished project: the ratio of industry standard effort to the actual effort expended. Measuring productivity on a handful of finished projects provides a baseline and a starting point for reliably estimating productivity on future projects.
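In code, the productivity metric is straightforward; the project figures below are assumed for illustration only.

```python
# Productivity = industry standard effort / actual effort expended.
# The numbers are placeholders, not real benchmark data.

completed_projects = {
    "chip_a": {"standard_effort": 900, "actual_effort": 1100},
    "chip_b": {"standard_effort": 1400, "actual_effort": 1250},
}

for name, p in completed_projects.items():
    productivity = p["standard_effort"] / p["actual_effort"]
    print(f"{name}: productivity = {productivity:.2f}")

# A value below 1.0 means the team spent more effort than the industry norm;
# a value above 1.0 means it was more productive than the average team.
```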
Why is all this useful? In short, combining complexity and productivity modeling yields a fact-based approach to generating accurate and precise staffing requirements for IC projects, and this translates to on-time delivery and business success.
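Putting the two pieces together, a back-of-the-envelope staffing estimate might look like the following, with all inputs assumed for the sake of the example.

```python
# Hedged sketch of turning complexity and productivity into a staffing plan.
# Every input here is an assumed value, not real project data.

standard_effort = 1000       # person-weeks, from the calibrated complexity model
baseline_productivity = 0.9  # ratio measured on this team's finished projects
schedule_weeks = 52          # target development cycle time

expected_effort = standard_effort / baseline_productivity  # person-weeks this team needs
average_headcount = expected_effort / schedule_weeks

print(f"Expected effort: {expected_effort:.0f} person-weeks")
print(f"Average staffing over {schedule_weeks} weeks: {average_headcount:.1f} engineers")
```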
Ronald Collett is president and CEO of Numetrics, which provides fact-based project planning and benchmarking software that improves IC development productivity and schedule predictability.