During the next five years, a great many semiconductor companies will be faced with an increasing number of underperforming business units. Chances are they'll be selling or spinning them off. Some chip companies, large and small, will disappear altogether. Why?
Poor schedule predictability—persistently missed product development schedules—is the bane of semiconductor companies. It is a pervasive problem in an industry whose R&D stakes are now excruciatingly high. Typical SoC development costs, for example, range from $50 million to $100 million (from design concept to release-to-production). With a target return of 5X to 10X and a narrow market window, a mere three-month slip can dramatically reduce revenue and profitability. No surprise.
Projects miss schedule when management underestimates or fails to acknowledge the time and resources the R&D organization needs to develop complex ICs. Absence of a reliable estimation process is the crux of the problem. Curiously, many semiconductor executives have been blind to the issue, and yet it is the key failure mechanism of their businesses because it goes to the heart of their product development engine. Go figure.
Accurately calculating design difficulty—the prerequisite to reliably estimating resources and schedules—demands quantitative methods that measure the intrinsic difficulty of designing an IC's logic, circuitry, packaging, etc. However, it must also take into account the sizable stochastic footprint of chip development.
Some executives fail to recognize that the stochastic aspect of IC development has a far greater impact than ever before—a consequence of soaring development costs, inexorable competition and the limited size of each market opportunity. Resource and schedule planning therefore demands a stochastic model, which contemplates events that projects routinely encounter. Examples include spec changes, EDA tool issues, IP quality, project management, organizational issues, etc. It's not exceptionally hard to model the development process, but it does require a good deal of industry data.
Project and program managers typically rely on experience and intuition to create project plans. Not a bad start, but the approach is light on facts and data and heavy on heuristics and hope. IC development has both a deterministic and stochastic component, and they are inextricably intertwined. Most project plans contemplate only the deterministic aspect of complexity, which is why they are flawed from the start. Not addressing the "stochastic problem" will almost certainly translate into reduced shareholder value and lost jobs.
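To make the deterministic-plus-stochastic idea concrete, the paragraphs above can be sketched as a small Monte Carlo simulation: a deterministic base schedule, plus risk events (spec changes, EDA tool issues, IP quality problems) that each occur with some probability and add delay. This is a minimal illustration, not Numetrics' actual model — every task name, probability, and duration below is an invented assumption.

```python
import random

# Hypothetical sketch: deterministic base schedule plus stochastic risk
# events. All figures are invented for illustration.

BASE_WEEKS = 52  # deterministic estimate from the bottom-up project plan

RISK_EVENTS = [
    # (description, probability of occurring, delay in weeks if it occurs)
    ("spec change",        0.50, 8),
    ("EDA tool issue",     0.30, 4),
    ("third-party IP bug", 0.25, 6),
    ("staffing gap",       0.20, 5),
]

def simulate_once(rng):
    """One Monte Carlo trial: base schedule plus any risk delays."""
    weeks = BASE_WEEKS
    for _, prob, delay in RISK_EVENTS:
        if rng.random() < prob:
            weeks += delay
    return weeks

def schedule_distribution(trials=10000, seed=1):
    """Run many trials and report median (P50) and P90 completion times."""
    rng = random.Random(seed)
    outcomes = sorted(simulate_once(rng) for _ in range(trials))
    return outcomes[trials // 2], outcomes[int(trials * 0.9)]

if __name__ == "__main__":
    p50, p90 = schedule_distribution()
    print(f"deterministic plan: {BASE_WEEKS} weeks")
    print(f"P50 outcome: {p50} weeks, P90 outcome: {p90} weeks")
```

The point of the sketch is that the deterministic plan is only the floor of the distribution: a plan that ignores the risk events will be beaten by the median outcome almost every time, which is exactly the flaw described above.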
Ronald Collett is president and CEO of Numetrics Management Systems, Inc. www.numetrics.com
My observation is that giving a team an unrealistic schedule is usually counter-productive because it is a demotivator. Far better is to give a team a very aggressive -- but achievable -- schedule. When I say "very aggressive," I mean a schedule that assumes the development team's productivity will be best-in-class. In other words, in order to achieve the target schedule, the team must perform at or near best-in-class. The result (based on tracking this phenomenon on hundreds of projects across the industry) is that the productivity of those teams is far above the norm -- and the projects are on schedule (slip is less than 10%) and come in within tolerable budget margins.
Thanks for your comment.
There are definitely companies making appropriate up-front tradeoffs -- Intel for example. Here is an abstract of a paper presented by Intel at an industry conference held earlier this year devoted to project planning strategies:
"Effective planning for complex design projects requires understanding the tradeoffs between resources, complexity, schedule, and risk. Resource decisions are based on a particular project's priorities, which may impact team productivity [because team size might need to be increased to hit target schedules, and increased team size will reduce productivity -- but increase throughput], and hence are important to comprehend in any systematic approach to benchmarking or planning. An empirical model for silicon design is built from available data: both from internal and external sources. This model can be used to understand relationships between team size and productivity, complexity and duration, risk and performance to schedule. Use of the model to predict organizational limits for complexity is discussed along with how key trends like reuse and modularity are unavoidable responses to current trends. This model is used throughout Intel to help make silicon design resourcing decisions . . . ."
In light of the above, perhaps it's not surprising that Intel's financial performance continues to be exceptionally strong.
I could only dream to be on ANY project that was on time, on budget, and up to full performance. I (when I was young and naive) would give realistic schedule estimates to my boss; he would double or triple it and give it to the VP; the VP would talk with marketing and come back with a schedule that was 75 to 80% of my original one. There is always pressure to shorten the proposed schedules either by the boss or higher-ups. What usually happens is the engineering staff gets hammered and is blamed for the slipping schedules. NO-ONE remembers the original schedule; it's just a fact. Until companies realize that the designs will take just so much time / resources and make the appropriate trade-offs UPFRONT, they are bound to lose market share or fail. I might suggest a sliding schedule analysis that provides the cost versus time-to-market trade-offs, balancing shorter/more costly development with earlier/more profitable releases. Anyone know a company doing this?
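The sliding schedule analysis suggested above could be sketched as follows: for each candidate schedule, compare the extra development cost of compression (larger team, overtime) against the revenue forfeited by a later market entry, and pick the schedule with the best net return. Every number here is an invented assumption for illustration, not data from any real company.

```python
# Hypothetical "sliding schedule" trade-off sketch. All figures invented.

MARKET_WINDOW_REVENUE = 300.0   # $M if the chip ships at the earliest date
REVENUE_LOSS_PER_MONTH = 12.0   # $M forfeited per month of later entry
BASELINE_MONTHS = 24            # realistic, uncompressed schedule
BASELINE_COST = 60.0            # $M development cost at the baseline pace

def development_cost(months):
    """Assume compressing the schedule costs extra, growing quadratically
    with the number of months cut from the baseline."""
    compression = max(0, BASELINE_MONTHS - months)
    return BASELINE_COST + 1.5 * compression ** 2

def net_return(months, earliest=18):
    """Revenue at this ship date minus the cost of hitting it."""
    delay = months - earliest
    revenue = MARKET_WINDOW_REVENUE - REVENUE_LOSS_PER_MONTH * delay
    return revenue - development_cost(months)

if __name__ == "__main__":
    for months in range(18, 27):
        print(f"{months} months: net ${net_return(months):7.1f}M")
    best = max(range(18, 27), key=net_return)
    print(f"best trade-off: {best} months")
```

With these made-up numbers, neither the fastest nor the cheapest schedule wins; the optimum sits in between, which is the UPFRONT trade-off the comment argues for.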
Ronald, you write "Projects miss schedule when management underestimates or fails to acknowledge the time and resources the R&D organization needs to develop complex ICs" but my experience on many IC projects has been that all projects miss schedule because they are designed to do so! As one executive explained to me, "if I make a realistic schedule we will be late to market by one year, but if I shorten it unrealistically we might be only 6 months late"...has anyone worked on an IC design project that met the deadline and was on budget??? Kris