Rick, thanks for pointing to my article, "28nm – The Last Node of Moore's Law". It should be pointed out that I did not say "Moore's Law is already dead," since there is a big difference between Moore's Law and dimensional scaling (read my blog "Are we using Moore's name in vain?"). We believe Moore's Law could be extended for at least a decade by scaling up using monolithic 3D. It is even better to see that the industry seems to agree. First were the memory vendors, moving to monolithic 3D scaling with 3D NAND, introduced by Toshiba as BiCS and already in mass production by Samsung as V-NAND. And recently we reported that the industry's logic-SoC leader agrees: "Qualcomm Calls for Monolithic 3D IC". Today we learned that Qualcomm has engaged with a major foundry, SMIC, to bring monolithic 3D to market. Quoting the SMIC-Qualcomm press release: "Going forward, SMIC will also extend its technology offerings on 3DIC and RF front-end wafer manufacturing in support of Qualcomm Technologies."
@Rick: "exponential curves don't come along that often. This industry has had the delightful good luck of riding an exponential curve for several decades"
I think this is the real point about Moore's Law's rise and fall. The scaling capabilities of the silicon transistor allowed for a long period of "easy" doubling of every critical metric (power, size, speed, cost...). But a material has its own physical limits, and we started to face them in the early 2000s.
I cannot understand why so many people get angry when someone points out that Moore's Law is running out of gas. After all, it is not a law of nature, but a very good ad-hoc estimation that applied to a specific technology (the CMOS transistor).
@rick merritt Don't get me wrong, I'd love to see those things, but we have to remember that as chips get cheaper, volume has to go up to keep Moore's law alive... and once we reach worldwide smartphone saturation...
I think the short term effect we are seeing is that smart SoC design will become more important as the time between new process node introductions increases.
I spoke with one semi industry colleague who stated that the engineers who do their product process shrinks used to make more money than the SoC design team members, because the shrink team added more economic value to their company than the designers. That situation has now reversed.
There's lots of opportunity for semi designers to implement better designs using existing process nodes. In the software world, we have a concept called "refactoring" where software product code is periodically updated to make it more maintainable and extensible. There's little or no change to functionality, but improvements in code size and performance often result from the refactoring process. With the increased cost of process shrinks, it will make more sense for semi design teams to refactor their RTL and then extend it to get better power consumption, performance or integration with new features.
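The refactoring idea above, in miniature, from the software side: a hypothetical legacy routine and a refactored equivalent (both function names are invented for illustration). Functionality is unchanged, and can be verified as such, but the refactored code is smaller and easier to extend:

```python
# Hypothetical "legacy" routine: an index-based loop with a redundant
# per-step reduction, the kind of cruft that accumulates over the years.
def checksum_legacy(data):
    total = 0
    for i in range(len(data)):
        total = total + data[i]
        total = total % 256  # reducing every step is unnecessary here
    return total

# Refactored: identical behavior for non-negative integers,
# less code, clearer intent.
def checksum_refactored(data):
    return sum(data) % 256
```

The same discipline applies to RTL: keep the verified behavior as the contract, then simplify the implementation underneath it.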
A big part of this process will be removing the legacy "cruft" that exists in modern SoCs. Your smartphone may have been built in 2014, but I will bet you $10 that there are some transistors in there that were synthesized from RTL that was designed in the 1990s. (Especially in your digital baseband modem!)
The standard operating procedure used by semis has traditionally been, "Once it's verified, don't change it unless there's a change to functional requirements." This will change: removing the cruft and re-architecting an SoC to get a smaller die area, better performance, and lower power consumption will cost less than doing a process shrink.
alex: I agree there are plenty of applications that could benefit from a continued Moore's law. My question was: can the sum market potential of these applications finance the continued investments required to keep up the pace?
Historically we have bought faster PCs every x years, keeping PC processor volumes up and enabling the continued investment from e.g. Intel in the massive research and fab equipment required to support Moore's law.
Moore's law has been self-fulfilling because of two statements:
a) The engineers have said: if the market will buy enough chips on the next node, we can afford to make the jump.
b) The customers have said: if you make it so much faster/cheaper/smaller, we will buy it, and we will buy more than the previous one, because there are more applications for it.
Once either of these statements fails, Moore's law fails.
Chip improvement will not stop, but it will slow down.
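The compounding at stake in that bargain can be stated as a simple doubling rule. A back-of-the-envelope sketch in Python, using the often-quoted illustrative baseline of the Intel 4004 (roughly 2,300 transistors in 1971) and a two-year doubling period; the function name and defaults are mine, not from the discussion above:

```python
def transistor_count(year, base_year=1971, base_count=2300,
                     doubling_period_years=2.0):
    """Projected transistor count under an idealized Moore's law:
    the count doubles once every doubling_period_years."""
    return base_count * 2 ** ((year - base_year) / doubling_period_years)
```

One doubling period from the baseline gives twice the count; twenty years gives ten doublings, i.e. a factor of 1,024. It is exactly this exponential that the two statements above must keep financing.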
Ole: there are plenty of future applications which could use faster and cheaper processing: personal virtual reality (which is clearly limited by processing power today) and augmented reality, industrial and personal robots, true artificial intelligence; just ask your sci-fi author for applications. And let's not forget all the research we can do with cheaper compute power.
Take all of those, and think how you can deploy them everywhere, including China and Africa.
The other question, walking hand in hand with "is Moore's law dead" is:
Does the market _need_ Moore's law? Moore's law has been fulfilled because the market has demanded ever faster, cheaper, smaller products with lower power consumption.
But: what products will require continuous exponential improvement from semiconductors?
(in layman's terms - "do we need faster computers?").
Maybe high-speed communications/networking will benefit, but it could probably find other ways (optics?) if needed. Anyway, this is not high enough volume to drive the development (and neither are monster consumer graphics cards).
Storage: yes, if we are to dump spinning disks, solid state needs to get cheaper; but will market demand be high enough to drive Moore's law onwards?