# Why I ignore most "extrapolations"

It's a lazy technique that ignores reality and its limits

The other day, I heard yet another alleged expert proclaiming "at this rate, in 10 years, we'll be at such-and-such point". [I don't remember exactly what the subject was; it doesn't matter.] Guess what? I now tune out almost every pundit or futurist who uses the phrase "at this rate."

Why? Simple: such extrapolations are an easy but lazy way of emphatically making a point, but they are usually based on meaningless straight-line extensions of data or, worse, exponential trending. As engineers, we know that trends cannot continue forever, or even for very long; that they rarely continue in a straight line; and that even a modest fixed-percentage change in some variable compounds to a very large (or small) number fairly quickly. For example, if you look at statistics that show global population growth of, say, 4% per year, and the Earth's population now is 6 or 7 billion, you could glibly say that the population will double in 17.5 years to between 12 and 14 billion.
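The doubling arithmetic is a one-liner worth checking. A quick sketch (the 4% rate is the article's illustrative number; the 17.5-year figure matches the back-of-envelope "rule of 70," while exact compounding gives about 17.7 years):

```python
import math

def doubling_time(rate):
    """Years for a quantity growing at `rate` per year to double."""
    return math.log(2) / math.log(1 + rate)

exact = doubling_time(0.04)   # exact compound doubling time at 4%/year
rule_of_70 = 70 / 4           # the familiar back-of-envelope estimate

print(round(exact, 1))   # ~17.7 years
print(rule_of_70)        # 17.5 years
```

The two agree to within a couple of months, which is exactly why the shorthand is so tempting to quote.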

While that is mathematically true, it will happen in the real world only if you accept as unchanging fact that the rate of increase is fixed, with nothing to slow it down or even stop it. We know that in reality, boundaries, limits, feedback loops, and asymptotes push back against such easy trend-line extensions, whether the subject is semiconductor process trends, power consumption, die size, or project management.
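To make the "feedback pushes back" point concrete, here is a toy comparison of naive compounding against growth that slows near a limit. The 4% rate carries over from the example above; the 10-billion carrying capacity is a made-up illustrative number, not a demographic claim:

```python
def project(pop, rate, years, cap=None):
    """Project a population forward `years` years.

    With cap=None: pure fixed-rate compounding (the lazy extrapolation).
    With a cap: logistic-style growth that is damped as the population
    approaches the carrying capacity `cap`.
    """
    for _ in range(years):
        if cap is None:
            pop *= 1 + rate                       # straight compounding
        else:
            pop += rate * pop * (1 - pop / cap)   # growth damped by the limit
    return pop

naive   = project(7e9, 0.04, 50)              # extrapolation: keeps doubling
limited = project(7e9, 0.04, 50, cap=10e9)    # feedback: levels off below 10B

print(f"{naive / 1e9:.1f}B vs {limited / 1e9:.1f}B")
```

Fifty years of straight 4% compounding lands near 50 billion people; the same starting rate with even a crude feedback term stays under the 10-billion ceiling. Same data, same starting rate, wildly different conclusions.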

So when an industry expert exclaims, for example, that product-design cycles are shrinking, having gone from 24 months down to eight months in the past five years, and then draws a conclusion by adding something like "if this trend continues...", I ignore the rest. OK, what will happen if this trend continues? Will the product-design cycle time go to zero? Will it go negative? Obviously, there is some limit on how short a new-product cycle can be, and it's somewhat greater than zero. Even a basic parameter such as temperature rise can't keep climbing indefinitely.
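The design-cycle example is easy to quantify: 24 months down to 8 months over five years implies a shrink factor of about 0.80 per year, and "at this rate" drives the cycle toward zero. A toy illustration (the 24- and 8-month figures come from the article's hypothetical; the extrapolation is my own sketch of the pundit's logic):

```python
# Yearly shrink factor implied by 24 -> 8 months over 5 years: ~0.80
ratio = (8 / 24) ** (1 / 5)

def naive_cycle(years_from_now):
    """'At this rate' extrapolation of the 8-month design cycle."""
    return 8 * ratio ** years_from_now

# Ten more years of "this trend continuing":
print(round(naive_cycle(10), 2))   # ~0.89 months: under four weeks, still heading to zero
```

Ten years out, the "trend" predicts a complete product-design cycle in under a month, and nothing in the extrapolation ever stops it from crossing zero.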

In his excellent book *Innumeracy: Mathematical Illiteracy and Its Consequences*, John Allen Paulos explored this misuse and misunderstanding of basic numerical concepts, and the situation has gotten worse since the book was published in 1988. I think there are several reasons for this:

- the decline in basic mathematics education,
- the ease with which numbers and statistics are now generated and overanalyzed,
- and perhaps most important, the need for lobbyists, fund-seekers, activists for all sorts of causes, and the media to try to get attention and stand out from the never-ending flow of data, rising above the background noise by crying "wolf" more loudly and more often.

Or, to be cynical: it's very often a way of saying "at this rate, we're all going to die soon due to xyz, if you don't give us more funding," while trying to borrow credibility from the imprimatur of mathematical certainty.

What quasi-mathematical statements turn you off, or do you routinely ignore? What examples of innumeracy do you see that especially annoy you, either within the engineering community or in the general media? ♦