PORTLAND, Ore.—When it comes to predicting the future, Moore's Law has been a time-worn bellwether. But did you know it's just one of several competing laws, with names like Sinclair-Klepper-Cohen's, Goddard's and Wright's Law?
Overall, the best long-term predictor is Wright's Law, which improves on Moore's Law's accuracy by framing its horizon in units of production instead of absolute time. For instance, Moore's Law predicts that every 18 months the density of semiconductors will double, whereas Wright's Law predicts that as the number of units manufactured increases, the cost of production decreases (no matter how long that might take). Thus Wright's Law, named after aeronautical engineer Theodore "T.P." Wright, offers more accurate long-term predictions, since it automatically adapts to economic growth rates.
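To make the contrast concrete, here is a minimal sketch of the two functional forms; the constants (initial cost, learning exponent, doubling period) are illustrative assumptions, not figures from the study.

```python
def moore_cost(t_years, c0=1.0, doubling_months=18):
    """Moore-style law: cost per unit falls exponentially with calendar time."""
    halvings = (t_years * 12) / doubling_months
    return c0 / 2 ** halvings

def wright_cost(cumulative_units, c0=1.0, alpha=0.3):
    """Wright's law: cost falls as a power law of cumulative units produced,
    however long producing them takes."""
    return c0 * cumulative_units ** -alpha

print(moore_cost(3.0))      # predicted cost after 3 years, regardless of output
print(wright_cost(10_000))  # predicted cost after 10,000 units, regardless of date
```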
(Figure: Growth of prediction errors for laws competing with Moore's Law shows Wright's Law is the best at long time horizons, Goddard's Law the worst at short time horizons, and Sinclair-Klepper-Cohen's the worst at long time horizons.)
Wright's Law and other alternatives, such as Goddard's (which postulates that progress is driven only by economies of scale) and Sinclair-Klepper-Cohen's (which combines Wright's and Goddard's), were compared against actual cost and production-unit data for 62 different technologies, including computers, communications systems, solar cells, aircraft and automobiles. Historical data allowed accurate comparisons using "hindcasting," whereby a statistical model was developed to rank the performance of each postulated law over time.
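As a rough illustration of the hindcasting idea (not the study's actual methodology or data), the sketch below fits each law on the early portion of a synthetic cost history, then scores its error on the held-out later years. The toy dataset and the log-space least-squares fit are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(2000, 2012, dtype=float)
units = np.cumsum(np.exp(0.4 * np.arange(len(years))))  # toy cumulative production
costs = 100.0 * units ** -0.3 * np.exp(rng.normal(0.0, 0.05, len(years)))  # noisy costs

def hindcast_error(x, log_y, split=8):
    """Fit log(cost) linearly against x on the first `split` points,
    then return the mean absolute log-error on the held-out remainder."""
    slope, intercept = np.polyfit(x[:split], log_y[:split], 1)
    pred = slope * x[split:] + intercept
    return float(np.mean(np.abs(pred - log_y[split:])))

log_costs = np.log(costs)
# Wright's law: log cost is linear in log cumulative units.
print("Wright:", hindcast_error(np.log(units), log_costs))
# Moore-style law: log cost is linear in calendar time.
print("Moore: ", hindcast_error(years - years[0], log_costs))
```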
MIT claims its results show that with careful use of historical data, future technological progress is forecastable with a typical accuracy of about 2.5 percent per year. The research was conducted by MIT professor Jessika Trancik, professor Bela Nagy at the Santa Fe Institute, professor Doyne Farmer at the University of Oxford and professor Quan Bui at St. John's College (Santa Fe, N.M.).
No, there are actually four types of physical laws: Definitions (e.g. F=ma, which defines the concept of force), Empirical (which are strictly observational), Theoretical (which embody an understanding), and Derived (which can be mathematically produced from other laws, e.g. torque). Derived laws stand and fall with the laws they are derived from. Theoretical laws apply wherever the understanding they are based on applies. Definitions are true no matter what; the question with definitions is not "are they true" but "are they relevant".
I think people use "laws" out of optimism--they really want to improve our predictive powers and "wish" that they could come up with a black-or-white "law" to help out. Of course, as you say, the world is shades of grey :)
The Higgs boson was confirmed based upon a statistical observation, and now most particle physicists believe the Standard Model to be accurate. I don't see a problem with claiming something is a law if the statistical analysis supports the claim. Even our own existence is based upon statistical interactions with photons. If you don't believe that, turn off all the lights in your bedroom tonight and tell us tomorrow if you could still see yourself in the mirror.
Moore's law is not a law, and people don't give a shit about it! It is a trend for the company to follow to make money, which in this case is flawed due to a major miscalculation: not taking into account the whole semiconductor ecosystem.
Moore's law might have been based on observation at first, but it has changed. It is now nothing more than an implicit objective followed by semiconductor companies. They adapt their efforts in order to reach this objective. If Moore's law has been "correct" for so long, it is only because SC companies made it correct. I don't think it is of any statistical relevance nowadays.
A physical law, according to the Oxford English Dictionary, is "a theoretical principle deduced from particular facts, applicable to a defined group or class of phenomena, and expressible by the statement that a particular phenomenon always occurs if certain conditions be present."
John Nash received a Nobel Prize for game theory, with lots of mathematical equations describing how humans interact. As engineers, let's not always look at things so black and white.
What do you mean by exact science? Exact like in Newton's equation for gravity or Einstein's later revised one?
Nothing in life is exact science. Not even Ohm's law. But it works well enough today to be a useful tool for getting work done. Moore's law does the same thing, for whatever observable reason.
There's also Duane's law, which is a corollary to the two-car garage syndrome; many people are likely familiar with it, even if not by name. It states that no matter how large your garage is, the amount of stuff you accumulate will fill it such that there is never quite enough room left to put even one car in.
That, by inference, leads to a corollary of Duane's law, which postulates that no matter what the performance of your computer system, the operating system will drag it down to the point at which it performs at a level just slightly below that of the first computer you ever owned. Unfortunately, Duane's law tends to negate the effects of Moore's law and presumably the others.
Which is why even Intel finally loves Linux on x86, Duane [my opinion, not my employer Intel's].
As for Moore's law, whether it was ever "true" or not, it sure drove a revolution we've all benefited from. I'm not sure Wright's correlation really means a whole lot; it's one of those "yeah, so?"s.
Gordon Moore caught the imagination of a generation of engineers, and the rest is history.
Two criticisms: 1) Wright's law is worthless as a prediction; it is parametric in units, but you need the prediction in time. So I would have to have an oracle predictor of units over time, and if I had that, who needs Wright? Not surprised it came out best in "hindcasting" (a toy calculation after point 2 illustrates this).
2) With Moore's Law, don't confuse cause and effect. Moore's Law is not an immutable cause; it was the target to which the semiconductor industry managed itself, and that is how we got the doubling effect.
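For what it's worth, here is the toy calculation mentioned in point 1. It's a minimal sketch under an assumed scenario: if cumulative production happens to grow exponentially at rate g, Wright's power law collapses into exactly a Moore-style exponential decline in time, so the time forecast is only as good as the production "oracle" behind it. All constants are illustrative assumptions.

```python
import math

alpha = 0.3   # assumed learning exponent (illustrative)
g = 0.5       # assumed exponential growth rate of cumulative production (illustrative)
c0, u0 = 1.0, 1000.0

for t in range(6):
    units = u0 * math.exp(g * t)            # assumed production trajectory over time
    wright = c0 * (units / u0) ** -alpha    # Wright: cost as a function of units
    moore = c0 * math.exp(-alpha * g * t)   # the same decline expressed in time
    print(t, round(wright, 4), round(moore, 4))  # the two columns agree
```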
Moore's law is based on a short-term and totally unsustainable condition: that market demand drives technological advances. We are beginning to hit the limits of the physics, but have not yet come close to the limits of the marketed hardware. The hard question is whether there can be any breakthroughs equivalent to phase-shift masks and the like. Crystalline quantum logic structures appear to be a long, long way off.
We need another 'Law', one that tracks the media's obsession with the 'sound bite' and its desire to appear educated, all to disguise the vacuous fluff it writes.
Thus there is an increasing divergence of the claims from reality.
My physics teacher always pointed out that extrapolation was dangerous.
It's possible that Moore's law may even be applicable into the future, aided of course by the Necessity-is-the-Mother-of-Invention Law. How many nanometer iterations are left in silicon, and what will happen after that?
What's interesting, of course, is that Moore's law predicts a doubling every 22 months, not 18 months. I heard it from the man directly at a talk back in 2005 at the Computer History Museum. He was being interviewed by Carver Mead. Google the following: "Computer History Museum Presents The 40th Anniversary of Moore's Law with Gordon Moore and Carver Mead." There is a video of the interview somewhere too.
If memory serves, he explained that originally he had said it would double every two years, but a few years later scaled it back by two months, to doubling every 22 months. A few years after that, Intel's marketing started a campaign saying that processor performance would double every 18 months (likely from refinements and jumps to half-step processes). In the talk, Moore explained that the two numbers got confused, and thus most folks wrongly say Moore's law predicts a capacity doubling every 18 months. It is actually every 22.
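A quick back-of-envelope computation (mine, not from the talk) of what the 18- versus 22-month figure implies over a decade:

```python
months = 120                          # a 10-year horizon
for period in (18, 22, 24):
    factor = 2 ** (months / period)
    print(f"doubling every {period} months -> about {factor:,.0f}x in 10 years")
```

Roughly 100x per decade at 18 months versus about 44x at 22 months, so the mix-up is far from cosmetic.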
That said, what does it mean for this study?
You can take the reporter out of the investigation, but you can't take the investigation out of the reporter.
The problem is not with the accuracy of Moore's law or the other laws. The problem is with your misinterpretation of the word. A "law" is not a fundamental truth of nature. It is simply a statement or formula. It can be completely wrong and still be a law.
I completely agree with "iniewski". Nevertheless, I predict that when computer chips have shrunk to their physical limit and Moore's Law has clearly reached the end of the line, there will be headlines that a "law" has been violated or found invalid. In any case, the longevity of this trend has been remarkable - in part because nobody wants the trend to end on their watch so the R&D efforts are redoubled when the end of the law's reign seems to be coming. The consumers of electronic products are the winners.
Moore's Law (Trend) is amazing. I have been in the semiconductor industry since 1982, and every five years or so (1987, 1993, 1998, 2003, etc.) I hear someone predicting that it will no longer hold, but it does, decade after decade.
Failure to account for Moore's Law has resulted in a number of projects being prematurely cancelled. Just because a technology does not yield a cost-effective product today does not mean that the same product won't be cost-effective in two to four years. Maybe the product needs to be put on the back burner, but way too often I have seen a project killed, only to be resurrected by another company a few years later when the costs become attractive.
I'd also like to point out that once we hit the physics-based limit on feature scaling, we still have the third dimension to keep us busy for a while, as well as ever-larger chip sizes. The actual Moore's law isn't just about speed or feature size; it's about computing power available at the most economic price point, which means that bigger, cheaper chips still count. Once we hit the physics limit, the foundries that can reach it will eventually get paid off, and chip prices will decrease, giving us an increase in computing power at a given price point and thus a bit of new life for the observation for a few more years.
While it may have limited applications in our industry, Cole's Law is my preference; it just tastes better. Simply stated, Cole's Law is "Cabbage In = Cabbage Out", with a little special sauce and horseradish.