Recently, the Wall Street Journal (p. 1, June 18, 2010) highlighted the pressure that rising commodity costs and falling consumer prices place on profit margins. This is nothing new to the semiconductor industry.
Commodity costs for critical semiconductor materials, like photomasks or gold, have risen from time to time, and yet the industry continues to reduce the cost per transistor by more than 30% per year.
Right now, shortages abound, so it seems premature to worry about costs when customers are demanding far more units than can be manufactured. But the supply/demand imbalance that currently dominates the industry will dissipate. What happens then?
As always, the input costs must continue to contribute their share of the cost reduction for semiconductors, or the long-term "learning curve" will break down. If gold fails to hold up its part of the cost reduction, other input materials must decrease in cost enough to offset it, or the industry will find ways to use less gold. When a portion of the input costs can't keep up, the semiconductor industry has cleverly found alternatives that can.
But what about irreplaceable costs, like semiconductor manufacturing equipment or the total cost of design? Can these costs keep up with the aggressive learning curve of the semiconductor industry? Of course they can, as they always have.
The average cost of a 200mm-equivalent bulk CMOS wafer from a foundry has decreased 22% over the last decade, despite the increasing manufacturing complexity that smaller design rules bring (Figure 1).
Figure 1. Foundry revenue per wafer (200mm equivalent)
The real measure of a learning curve is cost per transistor. The semiconductor industry has averaged more than 30% per year cost reduction over the last 50 years. As seen in Figure 2, that reduction in cost per transistor continues as we grow the volume of transistors shipped every year. Input costs, semiconductor manufacturing costs, materials, labor, design, etc., contribute a share to make this happen.
Figure 2. 1985-2007 IC learning curve that has enabled the semiconductor industry to deliver an average of more than 30% per year reduction in costs
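As a back-of-the-envelope illustration of how powerful that learning curve is (the function below is a sketch, not taken from the article's data), a sustained 30% annual reduction compounds quickly:

```python
def cost_after(years, initial_cost=1.0, annual_reduction=0.30):
    """Cost per transistor after compounding an annual percentage reduction."""
    return initial_cost * (1 - annual_reduction) ** years

# A sustained 30%/year decline leaves under 3% of the original cost
# after a single decade (0.7 ** 10 is roughly 0.028).
decade_cost = cost_after(10)
```

Compounded over the 50 years the article cites, the same rate implies a reduction of many orders of magnitude, which is why every input cost, not just silicon, has to keep pace.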
What about the cost of design? Dire predictions of rapidly increasing design costs have been published in the International Technology Roadmap for Semiconductors for years. VLSI Research has calculated the number of transistors shipped per year (historically, the SIA has also calculated these numbers) and the Electronic Design Automation Consortium (EDAC) tallies the total sales of EDA design software including support. By dividing the EDA license and support TAM (total available market) by the number of transistors shipped, we can plot the EDA software cost per transistor.
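The division described above is simple but worth making explicit. The numbers below are invented placeholders for illustration only; the real inputs come from the EDAC and VLSI Research data sets:

```python
# Hypothetical inputs -- the real values come from EDAC (EDA license
# and support TAM) and VLSI Research (transistors shipped per year).
eda_tam_dollars = 4.0e9        # assumed annual EDA license + support TAM
transistors_shipped = 4.0e18   # assumed transistors shipped that year

# EDA software cost per transistor, the quantity plotted against
# total IC revenue per transistor in the learning-curve comparison.
eda_cost_per_transistor = eda_tam_dollars / transistors_shipped  # 1e-9 dollars
```

With these made-up inputs the result is a nanodollar per transistor; plotting the real annual series on a log scale is what produces the learning curve shown in Figure 3.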
SURPRISE: EDA cost per transistor is coming down the same learning curve as all the other input costs like materials, chemicals, labor, etc. (Figure 3) and it has been doing so throughout semiconductor history.
Figure 3. EDA costs per transistor (in blue) are dropping at the same rate as total IC revenue per transistor
As a result, the EDA TAM has remained a nearly constant share of the semiconductor TAM in recent history, at about 2% (Figure 4).
Figure 4. EDA revenue has averaged 2% of IC revenue over the past 15 years.
Wally - I am not able to extrapolate from the charts what the ratio of EDA tool cost to company revenues has been over the years. I understand that the overall cost per transistor is decreasing, but as EDA companies segment their features into different licenses, the EDA tool cost might be increasing relative to the total revenues of companies. Any insight on this would help.
The Gary Smith EDA - Proposed ITRS Cost Chart 2010 shows the combination of "Embedded Software Automation" tool costs and total software engineering costs in the blue part of Figure 5. Some of the system analysis cost is included in the EDA tool and engineering expense, but it is undoubtedly small, since the electronic system-level automation tool portion of the EDA TAM is only about 5%, and that includes high-level synthesis. The ESL panel at DAC this year delivered the impression that ESL analysis tools have finally come into their own and are being used for significant amounts of electronic system-level design and analysis for chips. That said, you are certainly right that this is an area of opportunity for EDA, since much (most?) of the high-level architectural analysis is still done by system engineers using their own proprietary tools. For embedded software, commercial tools predominate, but virtual prototyping for hardware/software co-verification is still (after 15 years of EDA sales) only about 1% of the EDA license and maintenance TAM.
For data consistency, I've included only EDA license and maintenance revenue in the learning curve example. EDAC began reporting IP sales data in its current form in 2005, so there is no comparable history to use in an analysis. IP sales are certainly a growing part of the EDA market and a growing part of the cost of designs. Last year, they represented nearly $1 billion if you include IP from companies like ARM, and over the last five years they would have increased "EDA costs" from 2% to about 2.5% of semiconductor revenue. On the other hand, most chip designs are derivatives of existing designs and incorporate lots of IP from previous designs done by the same company; this reuse cost has been ignored in the historical data. The difference now is primarily a "make vs. buy" decision to reuse IP from other companies because, at least in some cases, it is less expensive than creating it yourself.
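The 2% to 2.5% shift can be sanity-checked with rough, assumed numbers (neither the revenue base nor the 2% share below is taken from a published figure; they are placeholders consistent with the ratios quoted above):

```python
semiconductor_revenue = 2.0e11   # assumed ~$200B annual IC revenue
eda_license_and_support = 0.02 * semiconductor_revenue  # the ~2% historical share
ip_revenue = 1.0e9               # the ~$1B commercial IP cited above

# Adding commercial IP on top of license and maintenance revenue.
share_with_ip = (eda_license_and_support + ip_revenue) / semiconductor_revenue
```

With these assumptions the combined share works out to 2.5%, consistent with the half-point increase described in the text.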
Very interesting article. I wonder what the shares of embedded software development costs and system analysis costs are, taken separately? Also, the article assumes that the system analysis tools actually deliver what the customer wants. I think we are still far away from that, and end customers still have to do a lot of system analysis and embedded software development on their own.
I'd be interested in seeing some of the other project costs included too. For example, how about IP? It seems like EDA companies are betting on IP being a growing segment, judging by recent purchases (Denali, Virage, etc.).