So what if Moore's Law does expire? It doesn't mean the end of semiconductors or of improved processing, it just means the improvement won't come from simply shrinking a silicon die. I think the economics will run out before the physical limits are hit, but that is just my opinion, and most of the above are just others' opinions; there are no facts in this yet (well, except maybe for Javier's piece, worth reading).
What Moore's "Law" originally said was that the number of components that will fit on one chip *at minimum cost per component* doubles every two years (or whatever time).
There's an optimum die size at each process node that minimises cost per gate, depending on density and yield. Cost per gate then depends on wafer cost, which double/triple patterning drives through the roof.
So Moore's law is *already* dead at 20nm: the cost projections, even looking out several years, show that cost per gate at 20nm never falls below the 28nm level, and the same holds for 14nm.
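The economics above can be sketched with a toy calculation. This is only an illustration, not real foundry data: the wafer costs, gate densities, and defect densities below are made-up numbers, and the yield is modeled with a simple Poisson assumption. The point it shows is the one in the comment: if multi-patterning raises wafer cost faster than the node raises density (and yield dips early in the node's life), cost per gate goes *up* at the new node.

```python
import math

def dies_per_wafer(wafer_diameter_mm, die_area_mm2):
    """Approximate gross dies per wafer (standard area-minus-edge-loss formula)."""
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def yield_poisson(die_area_mm2, defect_density_per_mm2):
    """Poisson yield model: probability a die has zero killer defects."""
    return math.exp(-die_area_mm2 * defect_density_per_mm2)

def cost_per_gate(wafer_cost, die_area_mm2, gate_density_per_mm2,
                  defect_density_per_mm2, wafer_diameter_mm=300):
    """Wafer cost divided by (good dies per wafer * gates per die)."""
    good_dies = (dies_per_wafer(wafer_diameter_mm, die_area_mm2)
                 * yield_poisson(die_area_mm2, defect_density_per_mm2))
    gates_per_die = die_area_mm2 * gate_density_per_mm2
    return wafer_cost / (good_dies * gates_per_die)

# Illustrative (made-up) numbers: the "20nm" node is ~1.6x denser, but
# double patterning makes its wafers ~1.8x more expensive and yield is lower.
c28 = cost_per_gate(wafer_cost=3000, die_area_mm2=100,
                    gate_density_per_mm2=1.2e6, defect_density_per_mm2=0.002)
c20 = cost_per_gate(wafer_cost=5500, die_area_mm2=100,
                    gate_density_per_mm2=1.9e6, defect_density_per_mm2=0.003)
# With these inputs, c20 comes out higher than c28: the density gain
# is swamped by the wafer-cost increase, which is the whole argument.
```

Plug in your own numbers; the crossover is entirely driven by the ratio of wafer-cost growth to density growth.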
The whole industry has been built around the fact that the next process delivers more function for the same price, as well as lower power and higher speed. Once the economic reason to move disappears, only power and speed are left, and the improvements in those are slowing down.
Yes it's technically possible to use quadruple patterning to do 10nm, but I don't think anyone thinks it makes sense economically. Without EUV or direct-write (still coming Real Soon Now, like for the last 10 years) it's difficult to see what such processes could be used for -- maybe a few high-margin products (e.g. Intel, Apple) which need billions of transistors or the lowest possible power and are willing to pay a premium for this, but not the vast majority of chips where cost-per-gate is key.
I think Chipmunk is right. Optical chip-to-chip connections could be very important. On-chip components are connected by intra-chip buses. If the off-chip bottleneck is removed by making the chip-to-chip bus as fast as the intra-chip bus, then massive integration is no longer as necessary as it is now. Think of a virtual SoC made up of several chips glued together by optical interconnects.
Most of the components have already been designed, though not together in one chip yet. Avalanche diode detectors, silicon waveguides and other optical components, a few different all-optical switching methods. Even nano-lasers smaller than the wavelength of the light are possible and have been demonstrated. The missing ingredient is a means to cheaply add a layer to create lasers, as Si is an inefficient laser medium. All-optical RAM is also an ongoing issue.
It will start with chip-to-chip comms, but will quickly move to intra-chip data busses. Eventually, ALUs, etc. will be moved to all-optical components. At some point, electronics itself may be replaced with "optronics".
Even Gordon Moore has seen his "law" extended to other fields. We should follow and explore the next application of Moore's Law, that of DNA sequencing: http://www.eetimes.com/electronics-news/4400684/Moore-s-Law-goes-biotech
For people who checked on graphene: I asked this same question of a friend of mine who is a post-doctoral researcher on graphene at my university. According to him, graphene transistors are not practical; graphene is only good for interconnects (he claims it will replace copper interconnects) and has some applications in sensors. In short, don't count on graphene :)
Let's be realistic: Moore's law was formulated with respect to silicon and similar semiconductor materials. If we manage to shift to an altogether different computing mechanism, then Moore's law is not what we should be speaking of. It should be about performance vs cost.
So, pragmatically speaking, Moore's law has been decaying since the days when gate leakage went through the roof. It will end, be realistic, but that doesn't mean the end of computing.