I expect that when we run out of steam on the current Moore's Law march with CMOS, there will be an industry transition to Quantum Dots. Unlike Quantum Computing, Quantum Dots are not probabilistic computing, and they can work at the single-atom level. I'm not sure that will buy you much more than 10 or 20 years beyond CMOS.
The other thing to keep in mind is that Moore's Law is not a law. We are constantly reversing cause and effect here: CMOS has continued to scale exponentially because the semiconductor industry managed its investments in scaling and drove its behavior to achieve those rates, not the other way around.
Finally, I would posit that with or without continued Moore's Law progress in electronic technology, there is already enough COTS hardware out there that a technological lead no longer matters. The world is already fighting that lead quite effectively with computer viruses, cyber attacks, IEDs, etc., so effectively that a peer nation's asymmetric disadvantage in hardware no longer decides the outcome.
Back in March 2013, EE Times ran an article regarding Moore's Law, which I will partially quote here:
"Recently researchers at Massachusetts Institute of Technology (MIT) compared the accuracy of each competing law in both its short- and long-term predictions. MIT claims their findings will improve the accuracy of future predictions about technological change, candidate technologies and policies for global change. Overall the best long-term predictor is Wright's Law, which improves its accuracy over Moore's law by framing its horizon in terms of units-of-production instead of absolute time. For instance, Moore's Law predicts that every 18 months the density of semiconductors will double, whereas Wright's Law predicts that as the number of units manufactured increases the cost-of-production decreases (no matter how long that might take). Thus Wright's Law--named after aeronautical engineer, Theodore "T.P." Wright--offers more accurate long-term predictions since it automatically adapts to economic growth rates. Growth of prediction errors for competing laws to Moore's Law shows Wright's Law the best at long-time horizons, Goddard's Law as the worse at short time horizons, and Sinclair-Klepper-Cohen the worst for long-time horizons. Wright's and other alternatives, such as Goddard's (which postulates that progress is driven only by economies of scale) and Sinclair-Klepper-Cohen's (which combines Wright's and Goddard's), were compared to the actual cost and production units in 62 different technologies, including computers, communications systems, solar cells, aircraft and automobiles. Historical data allowed accurate comparisons using "hind-casting" whereby a statistical model was developed to rank the performance of each postulated law over time. MIT claims its results show that with careful use of historical data, future technological progress is forecastable with a typical accuracy of about 2.5 percent per year. 
The research was conducted by MIT professor Jessika Trancik, professor Bela Nagy at the Santa Fe Institute, professor Doyne Farmer at the University of Oxford and professor Quan Bui at St. John's College (Santa Fe, N.M.)." Pasted from <http://www.eetimes.com/electronics-news/4408525/Moore-s-Law-trumped-by-Wright-s-Law?cid=Newsletter+-+EETimes+Daily>
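For concreteness, the two laws compared in the quoted study can be sketched as simple cost functions. The doubling time and the Wright exponent below are assumed example values for illustration, not the parameters MIT fitted:

```python
# Illustrative sketch of the two competing laws in the quoted study.
# The doubling time and the Wright exponent are assumed example values.

def moore_cost(years, doubling_time=1.5, c0=1.0):
    """Moore-style law: cost per component halves every fixed time interval."""
    return c0 * 0.5 ** (years / doubling_time)

def wright_cost(cumulative_units, b=0.32, c0=1.0):
    """Wright's Law: cost falls as a power of cumulative units produced,
    regardless of how long that production takes."""
    return c0 * cumulative_units ** (-b)

# With b ~= 0.32, each doubling of cumulative output cuts cost ~20%:
ratio = wright_cost(2.0) / wright_cost(1.0)
print(f"cost ratio per production doubling: {ratio:.3f}")
```

The key structural difference is visible in the signatures: Moore's form takes elapsed time as its argument, Wright's takes cumulative production, which is why Wright's automatically tracks changes in economic growth rates.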
I'm paranoid about national defense, too, but I don't see Moore's Law as the issue.
Moore's Law simply states that the number of transistors on integrated circuits doubles approximately every two years. At some point, we'll run into hard limits imposed by the laws of physics on how small a transistor can be, and there's some evidence we are approaching that limit. (We are arguably reaching a point where while it might be theoretically possible to shrink circuits smaller, in practice, you can't afford to do it.)
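The doubling rule stated above fits in one line. The Intel 4004 baseline (roughly 2,300 transistors in 1971) used here is just a convenient historical anchor, not part of the law itself:

```python
# Toy form of the doubling rule: transistor count doubles every `period` years.
# Baseline is the Intel 4004 (~2,300 transistors, 1971), used only as an anchor.

def transistors(year, base_year=1971, base_count=2300, period=2.0):
    return base_count * 2 ** ((year - base_year) / period)

print(f"{transistors(2011):.2e}")  # ~2.4e9, roughly the scale of 2011-era chips
```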
One of the developments of semiconductor electronics over the past few decades is commoditization. Circuitry gets smaller, faster, and cheaper. What used to require expensive proprietary hardware can now be done with off the shelf components. This affects all areas where such things are used, including national security and defence.
But the issue has always been less about what hardware you had than what you did with it. The race isn't hardware, it's applications. When the HW playing field is level and everyone has the same gear, you win by making smarter use of it. Moore's Law has nothing to do with that.
(I am curious about what areas you see where further scaling and component shrinkage might be critical. What sort of gear used in security and defence needs to be smaller to confer an advantage?)
I would respectfully disagree with Mr. Kurzweil on that. And I have a suggestion -- when somebody says "quantum computer", you should auto-translate that into "quantum accelerator." You would not like a computer that was truly and only based on quantum principles, because such machines are probabilistic. Would you really want to edit a document, save it to permanent storage, and then get a probabilistic version of it back tomorrow? Would you like your bank to keep track of your account balances that way? Me neither. There are a lot of computing tasks that just aren't appropriate for quantum technology, not now, and quite possibly not ever. So I do not foresee a wholesale transition to some sort of quantum-based technology.
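The storage objection above can be caricatured in a few lines. This is not a real quantum simulation; it just models a memory whose readback returns the stored bit only with some probability, and the probability value is an invented illustration:

```python
import random

# Caricature only -- not a real quantum simulation. We model a memory whose
# readback returns the stored bit only with probability p_correct; the
# parameter value is an invented illustration.

def deterministic_read(stored_bit):
    return stored_bit  # classical storage: always the value that was written

def probabilistic_read(stored_bit, p_correct=0.9, rng=random.Random(42)):
    return stored_bit if rng.random() < p_correct else 1 - stored_bit

document = [1, 0, 1, 1, 0, 0, 1, 0] * 4   # 32 "saved" bits
recovered = [probabilistic_read(b) for b in document]
errors = sum(a != b for a, b in zip(document, recovered))
print(f"{errors} of {len(document)} bits corrupted on readback")
```

No bank would accept a ledger with that failure mode, which is the point: quantum hardware fits naturally as an accelerator for probabilistic workloads, not as a general replacement for deterministic storage and logic.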
Mr. Kurzweil is basing his prognostications on humanity's long term overall cleverness, and I would agree that our history is quite amazing and something one should not readily bet against. But all industries and technologies can stall for considerable periods of time. After 100+ years, most of us still get around in metal boxes propelled by exploding hydrocarbons. It's not crazy to contemplate our industry doing that, and I would claim it's imperative for the Dept of Defense to think about it.
Very nice lecture. I think if you check the Lg (gate length) of transistors over the last 2-3 years, there hasn't been much shrinkage below 25 nm. So maybe we are already at the limit, practically. But while that dimension might be frozen at some point, the other dimension still allows some play to increase density. We'll probably hit that limit soon too, even by wrapping the gate around the fin. A 10 nm gate length is possibly too far-fetched. It's probably ~3X OT? And there's source-drain tunneling.
But they are implementing double patterning a little earlier than this limit and complaining about the costs. For example, say they want to go from a 60 nm half-pitch to a 42 nm half-pitch, and this requires double patterning. If that doubles the cost, the cost per component doesn't change. So they need to go from 60 nm to maybe 39 nm, so that at least they get some cost reduction.
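The arithmetic behind those half-pitch numbers: components per unit area scale as the square of the pitch ratio, so halving cost-per-component against a 2x patterning cost requires a pitch shrink of more than 1/sqrt(2).

```python
# Arithmetic behind the half-pitch example (illustrative numbers from the
# comment above: 60 nm baseline, double patterning assumed to double cost).

def density_gain(old_pitch_nm, new_pitch_nm):
    """Components per unit area scale as (old/new)^2."""
    return (old_pitch_nm / new_pitch_nm) ** 2

# 60 nm -> 42 nm roughly doubles density ((60/42)^2 ~= 2.04), so at
# double the process cost, cost per component is essentially flat:
print(2.0 / density_gain(60, 42))  # ~0.98 -- no real saving

# 60 nm -> 39 nm packs ~2.37x more components, leaving ~15% saving
# even at twice the process cost:
print(2.0 / density_gain(60, 39))  # ~0.85
```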
At 22nm, there are somewhere between 50 and 100 silicon atoms (depending on how you measure) between the source and the drain.
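A back-of-envelope check of that figure, using textbook silicon spacings. The count depends on which interatomic spacing you use, which is exactly why a range gets quoted:

```python
# Rough atom count across a 22 nm channel. Which spacing you count by
# (lattice-cell edges vs. bond lengths) sets the bounds of the range.
channel_nm = 22.0
si_cell_nm = 0.543    # silicon conventional lattice constant
si_bond_nm = 0.235    # nearest-neighbor Si-Si bond length

print(channel_nm / si_cell_nm)   # ~40 unit cells end to end
print(channel_nm / si_bond_nm)   # ~94 bond lengths end to end
```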
Moore's law, for conventional silicon, will end when that distance shrinks such that quantum mechanics takes over. At that point, we'll either need a semiconducting material with smaller atoms, or another approach.
Moore's Law is not dead, but it is noticeably aging, slowing down and getting cranky and hard to deal with.
In my one interview with Gordon Moore a decade ago I asked him if CMOS scaling would end. He said it would as we approach transistors the size of a few atoms (which we are doing now). He said advances would slow and get more expensive before this happened (as they are now).
Yes, there are advances in architecture (especially 3-D ICs) yet to come. And who knows, maybe someone will come up with a new device platform (graphene?) that holds the promise of several more decades of exponential improvement.
But for now Moore's Law is clearly slowing down and an end is in sight in perhaps 10-20 years.
People smarter than I am see it, such as Henry Samueli, founder of Broadcom and a former EE professor, who is out talking to his customers about it.
This is not a subject of debate. It is a reality smart people are starting to plan for.
Bert22306, I just don't follow your logic here. I'm old enough to remember when there wasn't a Moore's Law. We did not have a high-tech military then: no smart bombs, no UAVs, no GPS, no computer-guided anything. Now we do. If our technology stops giving us an advantage, we have a real problem for which there was no counterpart back then.
I think there are both absolute and relative implications of Moore's Law in the military arena. In absolute terms, one can make very capable military systems (say, radio, radar, jammers, etc.) with COTS components. If Moore's Law stopped tomorrow, or went on another 15 years, these systems would still be formidable, in the sense that they first and foremost must deal with Nature, not just adversaries. High power at high frequencies, with very capable FPGAs to do the processing, and algorithms pulled off the internet...the barrier to entry is no longer very high.
My concern about the end of Moore's Law isn't so much about such "absolute" threats. I worry about the relative threats, where an advantage in electronics translates into an operational capability.
Yes, there are many ways of continuing to improve computers, but I claim the sum total of all of them aren't worth a damn compared to the aggregate beneficence of Moore's Law. As I said in my talk, you cannot substitute any number of incremental improvements for the death of an exponential.
I'm a chip architect at heart. Consider the period 1980 - 2010. From my personal experience, chip clocks went from 1MHz to 3.5GHz, a 3500x improvement. How much did architecture and microarchitecture add on top of that? I'll guesstimate 50x - 100x. Admittedly, that's not completely fair, because much of our architect effort went towards making that clock improvement possible, but still, I do think there's a signal in that noise: the underlying exponential came from silicon. Is there 3500x beyond the end of Moore's Law? No way.
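The growth rate implied by those numbers is easy to check; the 30-year span and the 3500x figure come straight from the paragraph above:

```python
import math

# Growth rate implied by the comment above: 1 MHz -> 3.5 GHz over the
# 30 years from 1980 to 2010.
clock_gain = 3.5e9 / 1e6   # 3500x
years = 30

cagr = clock_gain ** (1 / years) - 1
doubling_years = years * math.log(2) / math.log(clock_gain)
print(f"~{cagr:.0%} per year, doubling every ~{doubling_years:.1f} years")
```

That works out to roughly 31% compound annual improvement, a doubling every ~2.5 years, which is the exponential the author argues no pile of one-time architectural tricks can replace.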