When the number of process steps has to double suddenly, it may no longer make economic sense to use a 30% linear shrink to advance to the next node: a 30% shrink only halves the die area, so if wafer cost doubles, the cost per transistor doesn't improve at all. It may have to be at least a 35% shrink; the max of course is 50%, which quarters the area. So 28 nm should be followed by 18 nm instead of 20 nm, for example.
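To make the arithmetic behind those numbers concrete, here is a quick back-of-envelope sketch (my own illustration, not from the comment above): a stated shrink percentage is a *linear* scaling factor, so the implied area, and hence density, goes as its square.

```python
# Back-of-envelope node-shrink arithmetic (illustrative only).
# A "30% shrink" means linear dimensions scale by 0.70, so area per
# transistor scales by 0.70^2 ~ 0.49 -- density roughly doubles.
# If process steps (and thus wafer cost) double, a 30% shrink leaves
# cost per transistor flat; a 50% shrink (area x0.25) is needed to
# keep halving it.

def next_node(node_nm: float, linear_shrink: float) -> float:
    """Node name that follows node_nm after a given linear shrink (e.g. 0.30)."""
    return node_nm * (1 - linear_shrink)

def area_scale(linear_shrink: float) -> float:
    """Area scaling factor implied by a linear shrink."""
    return (1 - linear_shrink) ** 2

print(round(next_node(28, 0.30), 1))  # 19.6 -> the familiar "20 nm" node
print(round(next_node(28, 0.35), 1))  # 18.2 -> an "18 nm" node instead
print(round(area_scale(0.30), 2))     # 0.49 -> density ~2x
print(round(area_scale(0.50), 2))     # 0.25 -> density 4x, offsets a
                                      #         doubling of wafer cost
```

With a 35% shrink the area factor is about 0.42, so even with doubled wafer cost there is still a modest (~15%) cost-per-transistor gain, which is the commenter's point.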
Of course brute-force scaling is not the only way forward. We still have all manner of "more than Moore" ways to improve microelectronic devices. And the end of Moore's Law doesn't mean a total end to scaling, just a slowing of the rate. That next node may now take a few years instead of just 18 months.
One argument for it all coming to an end is that FinFETs get us from 20 nm, which was the end of the line for planar transistors, to 5 nm before they too break down, and there is no new switch even on the drawing board. It took FinFETs 20 years to go from initial studies to first manufacture. That could mean it will take 20 or more years to get beyond 5 nm.
I find it hard to accept the DARPA person's opinion that US national security is threatened if Moore's Law comes to an end. There is plenty more to do beyond scaling that has not received the same attention! Someone already commented about "More Than Moore," and there are quite a few challenges remaining in circuit boards and substrates.
I'm the "DARPA person" in question. They pay us to be paranoid about national defense, and there really are people in the world who do not wish the United States well. (I hope that doesn't surprise anyone here.) So keep that bias in mind.
But I do think there's a real issue here, and it's the main reason I finally gave in and decided to perform some government service for the past couple of years. The issue is that for several decades, if you wanted to field military electronics, you developed it at great cost, but when it was complete, only peer nation states could afford to do likewise. Nowadays, commercial off-the-shelf (COTS) electronics are very high performance, readily available, and inexpensive. So many more players besides peer nation states can make electronics with military implications. We still do develop electronics beyond COTS for U.S. military purposes, but there are times when using COTS is just the best we or anyone else can do.
When Moore's Law finally grinds to a halt, further advances in COTS will continue, but at a far slower rate. Yes, there is low-hanging fruit in SW, algorithms, 3D stacking, specialized processors, and other items I mentioned in my talk. But you cannot sustainably combine lots of onesies and replace an underlying exponential. One of the motivating ideas for my talk was that the U.S. must plan for the end of Moore's Law as though it will cause all players, not just peer nation states, to end up with the same HW capabilities. That would drastically reduce some of the advantages the U.S. has long enjoyed in certain militarily relevant arenas. -Bob Colwell
That is a really scary thought that I'd never considered before. It's especially sobering because although I am sure people will argue about when, I think we can all agree that eventually Moore's Law will run out of steam.
But Mr. Colwell, I would ask: is it Moore's Law that needs to continue, or just scaling? I know they are largely considered the same thing, but I see a subtle difference. Process engineers and technologists can continue scaling to smaller nodes well beyond where we are today. The question is whether they can do it economically. If not, it wouldn't make sense for continued mass production of chips for use in smartphones, tablets, PCs, etc. But if it's a question of producing a limited number of chips for national security purposes, wouldn't the government want to continue funding that, even at great expense?
If Moore's Law is coming to an end with the current technology, it is high time that some different out-of-the-box technology evolved, e.g. nanotechnology or using molecular biology to build the circuits of the future.
In the last few decades, processing power has no doubt had a direct relation to the number of transistors. But times may have changed by now. Multi-core design and parallel processing may have a greater effect today.