PORTLAND, Ore.—When it comes to predicting the future, Moore's Law has been a time-worn bellwether. But did you know it's just one of several competing laws, with names like Sinclair-Klepper-Cohen's, Goddard's and Wright's Law?
Overall, the best long-term predictor is Wright's Law, which improves on Moore's Law's accuracy by framing its horizon in terms of units of production instead of absolute time. For instance, Moore's Law predicts that every 18 months the density of semiconductors will double, whereas Wright's Law predicts that as the number of units manufactured increases, the cost of production decreases (no matter how long that might take). Thus Wright's Law, named after aeronautical engineer Theodore "T.P." Wright, offers more accurate long-term predictions since it automatically adapts to economic growth rates.
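As a back-of-the-envelope illustration (this sketch and all of its parameter values are hypothetical, not taken from the study), the two laws boil down to simple functional forms: Moore's Law makes cost an exponential in calendar time, while Wright's Law makes it a power law in cumulative production:

```python
# Illustrative functional forms; every parameter value here is made up.
c0 = 100.0      # initial unit cost (arbitrary units)
T_double = 1.5  # Moore's doubling period in years (the 18 months cited above)
w = 0.3         # hypothetical Wright's Law learning exponent
x0 = 1_000      # cumulative units produced at time zero

def moore_cost(t_years):
    """Moore's Law: cost halves every T_double years of calendar time."""
    return c0 * 2 ** (-t_years / T_double)

def wright_cost(cum_units):
    """Wright's Law: cost falls as a power law in cumulative units produced,
    however long producing them takes."""
    return c0 * (cum_units / x0) ** (-w)

# If production grows exponentially, x(t) = x0 * g**t, Wright's Law reduces to
# a Moore-like exponential in time; the two diverge when growth rates change,
# which is why Wright's Law adapts to the economy while Moore's Law cannot.
g = 2.0  # hypothetical annual growth factor for units produced
for t in range(0, 7, 2):
    print(f"year {t}: Moore {moore_cost(t):6.2f}  Wright {wright_cost(x0 * g**t):6.2f}")
```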
[Figure: Growth of prediction errors for laws competing with Moore's Law shows Wright's Law as the most accurate at long time horizons, Goddard's Law as the least accurate at short time horizons, and Sinclair-Klepper-Cohen's as the least accurate at long time horizons.]
Wright's Law and other alternatives, such as Goddard's (which postulates that progress is driven only by economies of scale) and Sinclair-Klepper-Cohen's (which combines Wright's and Goddard's), were compared against actual cost and production data for 62 different technologies, including computers, communications systems, solar cells, aircraft and automobiles. The historical data allowed accurate comparisons using "hindcasting," whereby a statistical model was developed to rank the performance of each postulated law over time.
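A minimal sketch of the hindcasting idea may help; the study's actual statistical model is more elaborate, and the data below are fabricated, but the principle is to fit each law on the early part of a technology's cost record and score it by its prediction error on the held-out later years:

```python
import numpy as np

def fit_wright(cum_units, costs):
    """Least-squares fit of log-cost vs. log-cumulative-production (Wright's Law)."""
    slope, intercept = np.polyfit(np.log(cum_units), np.log(costs), 1)
    return lambda x: np.exp(intercept + slope * np.log(x))

def fit_moore(years, costs):
    """Least-squares fit of log-cost vs. calendar time (Moore's Law)."""
    slope, intercept = np.polyfit(years, np.log(costs), 1)
    return lambda t: np.exp(intercept + slope * t)

def hindcast_error(predict, inputs, actual_costs):
    """Mean absolute error in log-cost on the held-out part of the record."""
    return np.mean(np.abs(np.log(predict(inputs)) - np.log(actual_costs)))

# Toy example with fabricated data: fit on the first 10 years, test on the rest.
rng = np.random.default_rng(0)
years = np.arange(20.0)
cum_units = 1_000 * 2.0 ** years               # production doubling annually
costs = 100.0 * (cum_units / 1_000) ** -0.25   # a Wright-like cost decline...
costs *= np.exp(rng.normal(0, 0.05, costs.shape))  # ...plus multiplicative noise

train, test = slice(0, 10), slice(10, None)
wright = fit_wright(cum_units[train], costs[train])
moore = fit_moore(years[train], costs[train])
print("Wright hindcast error:", hindcast_error(wright, cum_units[test], costs[test]))
print("Moore  hindcast error:", hindcast_error(moore, years[test], costs[test]))
```

On this synthetic Wright-like record the Wright fit naturally wins; the study's point is that it also tends to win across the 62 real technology records once the horizon grows long.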
MIT claims its results show that, with careful use of historical data, future technological progress is forecastable with a typical accuracy of about 2.5 percent per year. The research was conducted by MIT professor Jessika Trancik, professor Bela Nagy at the Santa Fe Institute, professor Doyne Farmer at the University of Oxford and professor Quan Bui at St. John's College (Santa Fe, N.M.).
I completely agree with "iniewski". Nevertheless, I predict that when computer chips have shrunk to their physical limit and Moore's Law has clearly reached the end of the line, there will be headlines that a "law" has been violated or found invalid. In any case, the longevity of this trend has been remarkable, in part because nobody wants the trend to end on their watch, so R&D efforts are redoubled whenever the end of the law's reign seems to be coming. The consumers of electronic products are the winners.
Moore's Law (trend) is amazing. I have been in the semiconductor industry since 1982, and every five years or so (1987, 1993, 1998, 2003, etc.) I hear someone predicting that it will no longer hold -- but it does, decade after decade.
Failure to account for Moore's Law has resulted in a number of projects being prematurely cancelled. Just because a technology does not yield a cost-effective product today does not mean that the same product won't be cost effective in two to four years. Maybe the product needs to be put on the back burner, but far too often I have seen a project killed, only to be resurrected by another company a few years later when the costs become attractive.
I'd also like to point out that once we hit the physics-based limit on feature scaling, we still have the third dimension to keep us busy for a while, as well as ever-larger chip sizes. The actual Moore's Law isn't just about speed or feature size; it's about the computing power available at the most economic price point, which means that bigger, cheaper chips still count. Once we hit the physics limit, the foundries that can reach it will eventually pay off their capital costs, chip prices will decrease, and we'll get more computing power at a given price point, giving a bit of new life to the observation for a few more years.
While it may have limited applications in our industry, Cole's Law is my preference; it just tastes better. Simply stated, Cole's Law is "Cabbage In = Cabbage Out", with a little special sauce and horseradish.