Performance progress has traditionally been led by advances in semiconductor process, but if we are approaching a plateau (and I'm optimistic that progress will merely become challenging, both scientifically and financially, rather than grinding to a halt), then we still have algorithms and architecture in which to innovate. The challenges to advancing the semiconductor process also challenge incumbent RTL-based design, since larger system-level design perspectives will be required. Yes, that horse has been beaten for a long while, but there is growth (starting from an admittedly small base) as more and more people "get it" and turn to ESL.
For the others to catch up to the US in technology, the US would have to remain essentially static for an extended period of time, and you can be assured that is unlikely. In fact, it is more likely that the others will try to stay a few steps behind the US leading edge, which they fear is fraught with unknown risks they cannot learn about fast enough.
I'd be more worried about something all of mankind gets stuck on due to fundamental physics, such as limits related to quantum mechanics or entropy.
Bert: well, I respectfully disagree with your conclusion. Moore's Law leads to faster chips, but the military tends to use the most advanced chips -- which are more expensive even under Moore's Law. That favors the richer country, which economists say will be China in the not-too-distant future. If Moore's Law ends, and chips stop getting faster at that rate, that advantage is narrowed or -- in the extreme -- even eliminated.
Of course, the wealthier country could still benefit from other advanced technologies that would speed up processing outside the chip. And all major countries already have enough firepower to blow up the planet several times over, so this is probably academic.
On the civil liberties front, the demise of Moore's Law could limit the mass processing of big data to the point where there is no instant analysis of individuals based on "total information awareness" programs. That would cut into the expansion of intelligence programs like the NSA's.
I agree with your assessment of a quantum computer. However, technology being designed to implement a quantum computer could easily transfer to classical computers. In particular, the silicon photonics that Intel, IBM, and several others are working on could facilitate a paradigm shift such as the one suggested by Kurzweil.
What if an optical chip-to-chip data bus was as fast as the on-chip data bus? Just as we today connect a component implemented in one area of the die to another component in another area of the die, a chip-speed optical bus would allow components to be connected across dies. This would be a paradigm shift. Instead of distributed computing using clusters of individual and independent computers, we could treat individual dies as a single virtual die, or in other words a very large SoC. In addition, this virtual SoC would be scalable to fit in whatever power envelope was available. There are many possibilities, but this one seems quite feasible in the near future.
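The feasibility question above comes down to raw bandwidth. A rough back-of-envelope comparison might look like the sketch below; every figure in it (bus width, clock rate, wavelength count, lane rate) is an illustrative assumption chosen for this sketch, not a measured or vendor-published number.

```python
# Back-of-envelope comparison: an on-die bus vs. a hypothetical
# silicon-photonic chip-to-chip link. All figures are assumptions
# for illustration only.

# On-die bus: assume a 512-bit-wide bus clocked at 2 GHz.
on_die_width_bits = 512
on_die_clock_hz = 2e9
on_die_bw = on_die_width_bits * on_die_clock_hz / 8  # bytes/sec

# Optical link: assume 8 WDM wavelengths, each carrying 25 Gb/s.
optical_lanes = 8
optical_lane_bps = 25e9
optical_bw = optical_lanes * optical_lane_bps / 8  # bytes/sec

print(f"on-die bus  : {on_die_bw / 1e9:.0f} GB/s")
print(f"optical link: {optical_bw / 1e9:.0f} GB/s")
print(f"ratio       : {optical_bw / on_die_bw:.2f}")
```

Under these assumed figures the optical link delivers about a fifth of the on-die bandwidth, so closing the gap (more wavelengths per fiber, faster modulation) is exactly what would make the "virtual SoC" idea practical.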
"So, what happens if the whole world has equal access to technology? Does stability depend on one country having a bigger stick than everyone else?"
The truth is precisely the opposite: peace takes place when there is an equilibrium of power, and that happens when there is equal access to technology, not when one country has a bigger stick than everyone else.
The American expert is right in seeking US advantage in high technology, but that is not necessarily in the interest of world peace :-) Others have to seek the same advantage and at some stage, they will realize that their interests lie in collaborating and cooperating rather than constantly seeking an advantage over the others. It's a process and we are nowhere near maturity....
Could not agree more. I say bring it on! Our over-reliance on transistor-level improvements over 30 years or so made us LAZY, but what's 30 years in the history of human progress? Nothing. We ought to look at other levels, including:
- Algorithmic: for many decades now, our way of thinking about problem solving has been biased towards von Neumann implementation platforms built on semiconductor chips. Let's look beyond that and devise new algorithms for wider platforms and paradigms.
- Architectural: Hardware is not just about transistors; it's about computing and communication designs and architectures. I do not think we have explored the realm of possibilities here adequately; there is still a lot to be done.
- Physical: Binary electronics using semiconductors is one of many possibilities for computing, storage and communication. Here again, we have only scratched the surface.
To solve our computing, storage and communications needs, we must train a new breed of scientists and engineers. Out with the modularization, fragmentation and specialization of training and teaching, and in with holistic education.
I believe that Moore's law as *conventionally* stated will stop (transistors shrinking and # of transistors doubling every N years).
However, I feel that there will be innovations that continue to deliver improvements in performance, power consumption and functionality (recall that these are the *end* objectives that we are really interested in; scaling to smaller dimensions has just been the *means* of achieving this *end*).
And I feel that these innovations will enable using the same or maybe slightly *longer* channel length transistors (say, 45 nm) than the nodes we are headed toward (longer channel lengths imply better yields), and yet result in better performance and lower power. These innovations may come in the form of using a different material than silicon, etc.
Before you can declare Moore's Law dead, consider what it really is: a prediction. Then there's law. Take Ohm's law. It seems to be irrefutable. V=IR. It seems to be universal, even in space. Then there are "laws" that governments pass. They can be revoked, just like Moore's "law." Laws passed by governments are really more rules than laws. If it can be revoked, it's not a law in the first place.
Too many contradictions in these arguments, Tom. If Moore's Law continues to hold, it means that very soon everyone can afford these faster chips. Deep pockets or no. If Moore's Law stops or slows down, the faster technology will become expensive, available only to those with deep pockets. That seems to be what's happening already.
The effect on "national security" has to be that the richer a country is, the better off it is militarily *without* Moore's Law!!
Besides, I think that the term "national security threat" is being thrown about too loosely, used to justify way too many questionable things lately.