Editor's note: This is the first of a two-part opinion piece authored by EDA luminaries Jim Hogan and Paul McLellan.

Introduction

In nature, long periods of relatively stable environments are occasionally punctuated by large-scale changes that are the catalyst for evolution to create a large variety of mutations, and then for natural selection to weed out the unsuccessful ones.
The environment in which design methodology lives is also characterized by periods of relative stability punctuated by discontinuous change when the march of process nodes means that insignificant issues are suddenly major problems and when the scale of designs breaks the old methodologies. New approaches abound and, as in nature, the successful ones live on and others fall by the wayside.
Unlike in nature, however, these discontinuities are not rare and seem to come along roughly every ten years. We seem to be at another of these discontinuities today.

Lesson in nature

Let us consider our genus Homo and the family Hominidae to which we belong. At one time in Africa, at least three different species of the genus Homo were living at the same time. In Europe, up until roughly 40,000 years ago, two coexisted: warm-weather-adapted Homo sapiens (modern humans) and cold-weather-adapted Homo neanderthalensis. Today, only one species of the genus Homo is left on earth—us. We won the genetic lottery.
So it goes with business evolution—except that it moves much faster. The semiconductor ecosystem, in particular, shifts its value aggregation points on a somewhat predictable time scale, followed by a longer period of stability in technology and business models during which new companies are created.
These regular changes to the way electronic systems are realized usually take the form of two complementary changes. The first is a change in the way that the supply chain for semiconductors is partitioned and where the value is realized. The second is a change to EDA tools and design methodologies. Typically, the EDA change is a mixture of driving up the abstraction level to cope with increasing complexity due to Moore's law, driving down into physical effects that have become significant, and the invention of new algorithms so that design productivity is maintained. Eventually, de facto standards emerge that everyone can work with.

The era of IDMs

Prior to the early 1980s, semiconductor design happened entirely within specialist semiconductor manufacturers, which we then just called semiconductor companies but now call IDMs (integrated device manufacturers). Design tools were primitive: circuit simulation and polygon-based layout. Design was done at the transistor and polygon level. The value was almost entirely realized by the IDM. But the knowledge of how to do semiconductor design was leaking out into academia and research labs, and it would set the stage for the next transition.
I would say I'm mostly in agreement with you, Jim. However, I would also say that programming multicore/AMP/embedded machines requires a break with traditional programming methods and a move to asynchronous-FSM methods, which to a large extent will look very similar to hardware design (at a level above RTL). So hardware and software design will (IMO) merge into a single methodology (at least for digital design), and the open-source tools used in the software development world will invade what is currently known as ESL in the EDA world.
The awkward part of the equation is all that analog stuff—power management, RF, precision measurement, etc.—that hasn't been integrated into the SoC flows yet and doesn't look anything like software. It doesn't really have much support in the plug-and-play world of reusable IP.
Nice replay of the IC industry! I wonder how the untold part of system design/integration will evolve going forward. Testing features/functions (whether implemented in hardware or software) is a significant value add in realizing a finished product ready for shipment. The EDA industry and the systems houses struggle to speed up the process and provide enough coverage given the increasing complexity of the devices that comprise systems today.
As systems become more complex, the typical hardware design engineer becomes more of an 'integrator' than a designer. They will piece together the various components and IP to create the base system. If software is where the 'differentiation' comes from, the hardware becomes less important, and standard platforms should emerge with the mix of components needed for key applications. The processor, DSP, and FPGA elements will then just be used to run the software.
Great summary of IC industry evolution! Crisp and clear. We need more posts like this; looking forward to the second part.
Jim: I am editing a book on ICs. Would you be interested in expanding your text to small book chapter? firstname.lastname@example.org
Thanks, Hogan, for such a comprehensive article clearly describing the shift in the semiconductor business driven by the evolution of the technology. The majority of firms are now concentrating on producing IP that can be the building blocks of future semiconductor products. I envision a future in which many companies go fabless, integrating and verifying the available IPs, while the manufacturing of semiconductors becomes more standardized and manufacturing companies like TSMC and GlobalFoundries serve the needs of the fabless companies.
Join our online Radio Show on Friday, 11th July, starting at 2:00pm Eastern, when EETimes editor of all things fun and interesting, Max Maxfield, and embedded systems expert, Jack Ganssle, will debate just what is, and is not, an embedded system.