PORTLAND, Ore.—When it comes to predicting the future, Moore's Law has been a time-worn bellwether. But did you know it's just one of several competing laws, with names like Sinclair-Klepper-Cohen's, Goddard's and Wright's Law?
Overall, the best long-term predictor is Wright's Law, which improves on Moore's Law by framing its horizon in units of production instead of absolute time. For instance, Moore's Law predicts that the density of semiconductors will double every 18 months, whereas Wright's Law predicts that the cost of production decreases as the number of units manufactured increases, no matter how long that takes. Thus Wright's Law, named after aeronautical engineer Theodore "T.P." Wright, offers more accurate long-term predictions, since it automatically adapts to economic growth rates.
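The difference between the two framings can be sketched in a few lines of code. The constants below (an 18-month doubling period, a learning exponent of 0.3) are illustrative assumptions, not values from the study; the point is only that one curve is a function of time and the other a function of cumulative production:

```python
def moores_law(months, doubling_months=18.0, initial_density=1.0):
    """Moore's Law: density doubles every fixed interval of *time*."""
    return initial_density * 2 ** (months / doubling_months)

def wrights_law(cumulative_units, learning_exponent=0.3, initial_cost=1.0):
    """Wright's Law: each doubling of *cumulative production* cuts cost
    by a fixed fraction, regardless of how long the doubling takes."""
    return initial_cost * cumulative_units ** (-learning_exponent)

# After 36 months, Moore's Law predicts a 4x density increase:
print(moores_law(36))                   # 4.0
# Under Wright's Law with b = 0.3, doubling cumulative production
# lowers unit cost to about 81% of its previous value:
print(wrights_law(2) / wrights_law(1))  # ~0.812
```

Because Wright's Law indexes progress to production rather than the calendar, it automatically slows its forecasts when an industry's output stalls and speeds them up during booms.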
[Figure: Growth of prediction errors for laws competing with Moore's Law shows Wright's Law the best at long time horizons, Goddard's Law the worst at short time horizons, and Sinclair-Klepper-Cohen's the worst at long time horizons.]
Wright's Law and other alternatives, such as Goddard's (which postulates that progress is driven only by economies of scale) and Sinclair-Klepper-Cohen's (which combines Wright's and Goddard's), were compared against the actual cost and production figures of 62 different technologies, including computers, communications systems, solar cells, aircraft and automobiles. Historical data allowed accurate comparisons using "hindcasting," whereby a statistical model was developed to rank the performance of each postulated law over time.
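The hindcasting idea can be sketched as follows. The cost-versus-cumulative-units history below is made up for illustration, and this simple log-log least-squares fit is only a stand-in for the study's actual statistical model: fit a law on an early slice of the record, forecast the rest, and score the error.

```python
import math

# Hypothetical (cumulative units, unit cost) history for one technology.
history = [(1, 100.0), (2, 80.0), (4, 62.0), (8, 50.0), (16, 41.0), (32, 33.0)]

def fit_wright(data):
    """Least-squares fit of log(cost) = log(c0) - b * log(units)."""
    xs = [math.log(u) for u, _ in data]
    ys = [math.log(c) for _, c in data]
    n = len(data)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return math.exp(my - slope * mx), -slope  # (c0, b)

# "Hindcast": fit on the first half of history, forecast the second half.
train, test = history[:3], history[3:]
c0, b = fit_wright(train)
errors = [abs(math.log(c0 * u ** -b) - math.log(c)) for u, c in test]
print(f"b = {b:.3f}, mean log-error = {sum(errors) / len(errors):.3f}")
```

Repeating this over many technologies and many split points is what lets the ranking of the competing laws be scored on data the fits never saw.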
MIT claims its results show that, with careful use of historical data, future technological progress is forecastable with a typical accuracy of about 2.5 percent per year. The research was conducted by MIT professor Jessika Trancik, professor Bela Nagy at the Santa Fe Institute, professor Doyne Farmer at the University of Oxford and professor Quan Bui at St. John's College (Santa Fe, N.M.).
The problem is not with the accuracy of Moore's Law or other laws. The problem is with your misinterpretation of the word. A "law" is not a fundamental law of nature. It is simply a statement or formula. It can be completely wrong and still be a law.
What's interesting, of course, is that Moore's Law predicts a doubling every 22 months, not 18 months. I heard it from the man directly at a talk back in 2005 at the Computer History Museum, where he was being interviewed by Carver Mead. Google the following: "Computer History Museum Presents The 40th Anniversary of Moore's Law with Gordon Moore and Carver Mead." There is a video of the interview somewhere too.
If memory serves, he explained that originally he had said it would double every two years, but a few years later scaled that back by two months, to doubling every 22 months. A few years after that, Intel's marketing started a campaign saying that processor performance would double every 18 months (likely from refinements and jumps to half-step processes). In the talk, Moore explained that the two numbers got confused, and thus most folks wrongly say Moore's Law predicts a capacity doubling every 18 months. It is actually every 22.
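The gap between those two doubling periods is not trivial once it compounds. A quick back-of-the-envelope calculation (treating both versions as pure exponentials, which is of course an idealization) shows how far apart they land over a decade:

```python
# Projected growth factor over 10 years (120 months) under each doubling period.
for doubling_months in (18, 22):
    factor = 2 ** (120 / doubling_months)
    print(f"double every {doubling_months} mo -> about {factor:.0f}x in 10 years")
```

An 18-month doubling gives roughly 100x growth over ten years, while a 22-month doubling gives closer to 44x, so which number a study assumes materially changes its forecasts.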
That said, what does it mean for this study?
You can take the reporter out of the investigation, but you can't take the investigation out of the reporter.
It's possible that Moore's Law may even remain applicable into the future, aided of course by the "necessity is the mother of invention" law. How many nanometer iterations are left in silicon, and what will happen after that?
We need another 'law' that tracks the media's obsession with the 'sound bite' and its desire to appear educated, to try to disguise the vacuous fluff they write about.
Thus there is an increasing divergence of the claims from reality.
My physics teacher always pointed out that extrapolation was dangerous.
Moore's law is based on a short-term and totally unsustainable condition: that market demand drives technological advances. We are beginning to hit the limits of the physics, but have not yet come even close to the limits of the marketed hardware. The hard question is whether there can be any breakthroughs equivalent to phase-shift masks and the like. Crystalline quantum logic structures appear to be a long, long way off.
What do you mean by exact science? Exact like in Newton's equation for gravity or Einstein's later revised one?
Nothing in life is exact science, not even Ohm's law. But it works well enough today to be a useful tool for getting work done. Moore's law does the same thing, for whatever observable reason.