Stephen Brennan's guest column on the stimulus left me shaking my head. Here's my attempt at setting the record straight.
EE Times has done a great job of covering Obama's stimulus bill, but it carried a guest column that left me shaking my head. Pretty much every paragraph got something wrong—a real surprise considering that the author, Stephen Brennan, is a principal of a money management company.
Since this recession is affecting essentially everybody, I think you deserve to hear the real story. If nothing else, it will help you know what to expect in the coming months. In that spirit, here is Stephen's column along with my comments:
President Obama has just signed into law his economic stimulus package approved by Congress last week. The bad news is that it will speed an economic recovery about as much as the New Deal cured the Great Depression or massive government spending and direct liquidity injections into banks reignited Japan's economy in the 1990s.
It doesn't make sense to lump the New Deal and 1990s Japan together. It's true that both are examples of government interventions, but otherwise they have little in common. The outcomes of the two approaches were also very different. Japan experienced a lost decade, but most economists consider the New Deal a success.
Consider this: Between 1933 and 1936, the unemployment rate fell by roughly 10 percentage points. Today's labor force is just over 155 million, so an equivalent drop today would mean roughly 15.5 million jobs. That is a huge number of jobs. To put that in context, Obama expects his stimulus to create 3.5 million jobs. We would be lucky if Obama's stimulus were "only" as successful as the New Deal.
In fact, the President's cure is bound to hurt more than it helps…
I get that some people are unhappy with the stimulus bill, but it's pretty extreme to say it will make things worse. You need a strong argument to back up that kind of claim. As we'll see, Stephen fails to deliver.
… After all, it is hard to fix something you don't understand. And there are few signs that policy makers understand our current economic predicament, and in particular the impact the microprocessor life cycle has had on the economy with its four stages of introduction, growth, maturity and decline.
To understand today's problem and the pivotal role technology plays in the economy, a quick history lesson is necessary. The microprocessor's growth stage began with the introduction of the first PC by IBM in 1981. It ended with the death of classical scaling in 2000. Along the way, there were real increases in worker productivity, product innovation and wealth creation. Government created money, the financial system lent it and new, innovative companies like Intel, Cisco and Oracle rose to prominence around the microprocessor.
The period from 1982 until 2000 is often called the greatest secular bull market in American history, as the stock market rose about 1,200 percent.
Where to begin… It's certainly true that the stock market had a good run from 1982 to 2000, but why give the microprocessor all the credit? Many other important things happened in this period, including a sustained decline in interest rates. There were also major changes in regulations, tax laws, outsourcing, energy prices, etc.
And of course Stephen conveniently ignores things like the 1987 Black Monday crash or the huge market growth between 2003 and 2008. Good luck explaining either of those with microprocessors.
The trouble started when the microprocessor life cycle entered maturity. Immense productivity benefits became harder to achieve, and good investment opportunities dried up. Still, government monetary policy remained accommodative, resulting in cheap money going into dot-com startups with no real value.
After the inevitable crash in 2000, the government responded by lowering interest rates and making monetary policy even more accommodative. Once again there were still no good investment opportunities. This time the money flowed into residential real estate, and other assets like oil. False wealth was created based on cheap money and speculation, rather than innovation and productivity.
During the microprocessor's growth stage, there was a significant need for capital, and the economy had the ability to generate high growth rates with low structural unemployment. As the microprocessor life cycle matured, the need for capital and the growth potential of the global economy declined. Pushing capital into the system when it could not be productively invested only created speculative asset bubbles. There is a good reason companies from Exxon to Microsoft built up big cash reserves over the last several years: there are no good investments for immense sums of money in today's economy.
Oh boy. Let's start with the assumption that productivity is tied to processor speed. That's a good assumption if you're trying to compile a huge RTL design. But if you're writing a Word document, 1 GHz is just as good as 2 GHz. Even when processor speed is important, plenty of other things matter, like memory size, memory speed, system cost, power, etc. All of these areas have been moving forward even as processor clock rates stalled out.
Stephen is also overlooking the fact that productivity gains can come from software. Put a browser on a machine, for example, and it suddenly becomes more useful.
Finally, the idea that there are no good investments left is just plain silly. Let's take his two examples of Exxon and Microsoft. For Exxon, how about investing in battery technology, bio-diesel, wind, solar, or any of the other booming areas of clean tech? It's not like there is a shortage of opportunities—clean tech VC funding grew by 52% last year. As for Microsoft… if there are no good ideas left, how did MS competitors Apple and Google manage to do so well last year?
This is why the government's spending plan is doomed. Even if the Obama administration and Congress could more productively invest money than the private sector (a big "if"), high historical growth rates require that a big innovation like steam power, electricity, the internal combustion engine or the microprocessor be entering its growth stage.
First, the private sector isn't investing, so the public vs. private argument is moot. There is no private investment to crowd out!
Second, there are plenty of big innovations entering their growth phase, including clean tech and biotech.
Finally, I don't see why you need a major innovation to drive growth. The US has underinvested in so many areas that we could have zero innovation and still grow by fixing our existing systems. (I'm thinking of things like our roads, rail systems, and power grids—all of which are in bad shape compared to most developed countries.)
Of course government should do something. A working financial system is necessary for lending capital to the innovators of tomorrow. The toxic securities clogging the system should be cleared out by the Treasury Department. But throwing money at the economy is how we got into this mess. Moving from overly accommodative monetary policy to profligate fiscal policy will only crowd out the private sector, set the stage for poor economic performance and possibly spark inflation.
Let's be clear: There is plenty of blame to go around, but the banks are a big part of our current troubles. Nobody forced them to make risky loans. In fact, they fought tooth and nail for the right to make these loans. Why the heck should we give them billions of dollars to clear these assets off their books?
The good news is that economic cycles have their ups and downs. At some point, a disruptive technology will enter its growth phase. Yet such a development also carries economic risks. The microprocessor has become a largely commoditized technology, and MPU makers lack pricing power. With the emergence of a new technology, the risks of consumer inflation will rise.
Let me see if I've got this straight: MPU makers lack pricing power, so MPU prices go down. Therefore you get… rising prices!?! What the heck? How does that make any sense?
Or maybe Stephen is saying that a new technology will cause inflation. How exactly would that happen? Is somebody going to invent a technology that makes workers less productive? If they did, why would anybody use it? This is hands-down the most bizarre paragraph of the whole column.
That day still appears to be over the horizon. Previous secular bear markets tended to last more than a decade. Even if one optimistically dates the start of this period as 2000, we are still likely to be in for some additional difficult years characterized by economic false starts.
—Stephen Brennan (firstname.lastname@example.org) is a principal of Financia Capital, a San Francisco-based money management company.
Is Stephen's prediction right? I'll let you draw your own conclusion on that one.