PORTLAND, Ore.—When it comes to predicting the future, Moore's Law has been a time-worn bellwether. But did you know it's just one of several competing laws, with names like Sinclair-Klepper-Cohen's, Goddard's and Wright's Law?
Overall, the best long-term predictor is Wright's Law, which improves on Moore's Law by framing its horizon in units of production instead of absolute time. Moore's Law predicts that the density of semiconductors will double every 18 months, whereas Wright's Law predicts that the cost of production falls by a constant fraction each time cumulative production doubles, no matter how long that takes. Thus Wright's Law--named after aeronautical engineer Theodore "T.P." Wright--offers more accurate long-term predictions, since it automatically adapts to changing economic growth rates.
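The experience-curve form of Wright's Law can be sketched in a few lines. This is an illustrative toy, and the parameter values below are hypothetical, not figures from the study:

```python
import math

def wright_cost(cumulative_units, first_unit_cost, learning_rate):
    """Wright's Law as an experience curve: unit cost falls by a
    constant fraction each time cumulative production doubles.

    `learning_rate` is the fractional cost drop per doubling, e.g.
    0.2 means each doubling makes a unit 20 percent cheaper.
    (Both parameters here are hypothetical, for illustration only.)
    """
    b = -math.log2(1.0 - learning_rate)  # experience-curve exponent
    return first_unit_cost * cumulative_units ** (-b)

# With a 20 percent learning rate, each doubling multiplies cost by 0.8:
c1 = wright_cost(1, 100.0, 0.2)  # ~100
c2 = wright_cost(2, 100.0, 0.2)  # ~80
c4 = wright_cost(4, 100.0, 0.2)  # ~64
```

Note that time never appears in the formula: the prediction depends only on how many units have been produced, which is why the law adapts to fast or slow growth.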
Growth of prediction errors for competitors to Moore's Law shows Wright's Law as the best at long time horizons, Goddard's Law as the worst at short time horizons, and Sinclair-Klepper-Cohen's as the worst at long time horizons.
Wright's Law and the other alternatives, such as Goddard's (which postulates that progress is driven only by economies of scale) and Sinclair-Klepper-Cohen's (which combines Wright's and Goddard's), were compared against actual cost and production data for 62 different technologies, including computers, communications systems, solar cells, aircraft and automobiles. The historical data allowed accurate comparisons using "hind-casting," whereby a statistical model was developed to rank the performance of each postulated law over time.
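The hind-casting idea can be sketched as follows, under simplifying assumptions: fit a candidate law on the early part of a historical record, forecast the later part, and rank laws by forecast error. The series below are synthetic stand-ins (chosen so both laws fit well), not data from the 62-technology study:

```python
import math

# Synthetic stand-in data: cost declines over time while cumulative
# production grows (hypothetical values, for illustration only).
years = list(range(10))
cost = [100.0 * 0.85 ** y for y in years]             # unit cost over time
cum_units = [10.0 * 2.0 ** (0.5 * y) for y in years]  # cumulative production

def fit_log_linear(xs, ys):
    """Least-squares fit of log(y) = a + b * x."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(math.log(y) for y in ys) / n
    b = sum((x - mx) * (math.log(y) - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

def hindcast_error(xs, split=6):
    """Fit on the first `split` points, forecast the rest,
    and return the mean absolute error in log-cost."""
    a, b = fit_log_linear(xs[:split], cost[:split])
    errs = [abs(a + b * x - math.log(c))
            for x, c in zip(xs[split:], cost[split:])]
    return sum(errs) / len(errs)

# Moore-style law: cost is exponential in elapsed time.
moore_err = hindcast_error(years)
# Wright-style law: cost is a power law in cumulative production.
wright_err = hindcast_error([math.log(u) for u in cum_units])
```

Running this over many technologies and many fit/forecast splits, and averaging the errors, is the flavor of ranking the study describes; the real analysis is considerably more elaborate.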
MIT claims its results show that, with careful use of historical data, future technological progress is forecastable with a typical accuracy of about 2.5 percent per year. The research was conducted by MIT professor Jessika Trancik, professor Bela Nagy at the Santa Fe Institute, professor Doyne Farmer at the University of Oxford and professor Quan Bui at St. John's College (Santa Fe, N.M.).
Which is why even Intel finally loves Linux on x86, Duane [my opinion, not my employer Intel's].
As far as Moore's law goes, whether it was ever "true" or not, it sure drove a revolution we've all benefited from. I'm not sure Wright's correlation really means a whole lot; it's one of those "yeah, so?" observations.
Gordon Moore caught the imagination of a generation of engineers, and the rest is history.
There's also Duane's law, a corollary to the two-car garage syndrome, which many people are likely familiar with even if not by name. It states that no matter how large your garage is, the stuff you accumulate will fill it until there is just less than enough room left to put one car in.
That, by inference, leads to the computing corollary of Duane's law, which postulates that no matter what the performance of your computer system, the operating system will bog it down to the point at which it performs at a level just slightly below the first computer you ever owned. Unfortunately, Duane's law tends to negate the effects of Moore's law and presumably the others.
A physical law, according to the Oxford English Dictionary, is "a theoretical principle deduced from particular facts, applicable to a defined group or class of phenomena, and expressible by the statement that a particular phenomenon always occurs if certain conditions be present."
John Nash received a Nobel Prize for game theory, with lots of mathematical equations describing how humans interact. As engineers, let's not always look at things in such black-and-white terms.
Moore's law might have been based on observation at first, but it has changed. It is now nothing more than an implicit objective followed by semiconductor companies, which adapt their efforts in order to reach it. If Moore's law has been "correct" for so long, it is only because SC companies made it correct. I don't think it is of any statistical relevance nowadays.
Moore's law is not a law; people don't give a shit about it! It is a trend for the company to follow to make money, which in this case is flawed due to a major miscalculation: not taking into account the whole semiconductor ecosystem.