In the mid-1990s, just as the electronics industry was heading into a wonderful resurgence, I attended a lecture by Gordon Moore. The famed co-founder of Intel Corp. posed a question that seemed to be in stark contrast to the rosy economic outlook of the day. Reflecting on the early days of integrated-circuit technology, Moore pointed out that there was no reason to expect that integration density would proceed along a never-ending geometric growth curve, and he remained puzzled as to the fundamental factors that had sustained the VLSI revolution. He speculated that an accidental constellation of physical factors had come together in the element silicon to make it a favorite for integrated-circuit processes. And in that case, he said, it might be that with CMOS technology we could be witnessing the end of an era that would never be repeated.
Some might argue that such an end would be a good thing. Geometric growth in circuit density is the driver of the electronics industry, but the industry has done a good job of undermining its own economic base. Prices of electronics goods plunge as rapidly as the goods proliferate through society. The result is an abbreviated time to market saturation combined with shrinking profits for producers. Since this is great for consumers, it is hard to see such economic dynamics as a bad thing. But they put business planners in an ever-tightening vise.
In fact, businesses trying to make a profit from the traditional manufacturing and retail models may have arrived at the breaking point.
Economic theory has yet to catch up with the dynamics of the electronics industry, which has a tradition of defying conventional economic definitions and methods. Consider the seemingly simple problem of defining system performance. Granted, MIPS and gigaflops do not tell the whole story of computer performance, but they at least form a starting point for discussion. For economists, however, higher performance is meaningless unless it leads to higher productivity, and there are many other economic parameters that performance might affect.
So if next year's computer comes in a package much like this year's, with the same number of chips and disk drives and a comparable display, performance is the only differentiator. That leaves performance, locked up in the faster chips that replaced last year's model, as the only way to describe the computer's increase in value. Yet a stable economic definition of computer performance has proved elusive.
That definition changed again in the early 1990s, according to some analysts, when the Internet put a whole new spin on performance. Users no longer needed a high-end computer to log on and reach a search engine. Access became essentially free, in effect magnifying the performance of desktop machines to supercomputer level.
For example, I can read the front pages of all the major newspapers, network with colleagues, buy goods and services and search research libraries anywhere on the planet with that same relatively low-performance computer. Before the Internet, I engaged in those same activities, but on a much longer time scale. As one unit in some economic model, I have suddenly amplified my interaction with the outside world without going back to school or making a major investment in new tools. This is just one example of how the added value of a computer is growing far faster than our efforts to define it.
All of this is cause for worry, since the dynamics that drive the market are as mysterious today as they have always been. Are we in just another business cycle, or has the disruptive power of the integrated circuit finally tipped the boat over?
It is interesting to note that the Great Depression followed the prosperity engendered by Henry Ford's Model T, introduced in 1908, and the moving assembly line he perfected a few years later. Patterning their factories on his invention, other manufacturers began to produce a flood of products that just about anyone could afford. Ford tackled more than the technological and engineering problems of setting up a manufacturing line, however. He also realized that he needed an expanding market for his cars, to forestall a glut that would drive down prices.
Henry Ford's solution was to pay his workers higher wages, enabling them to buy the Model Ts rolling off the assembly lines. And for a while that approach worked splendidly.
But by 1929, an opposing dynamic had set in. Workers were out of jobs and wages were depressed. Worst of all, the performance-enhancing aspects of assembly-line production became irrelevant when businesses couldn't find anyone to buy their products.
The current economic situation has some disturbing parallels with the Great Depression, and some economists are warning that we might be on the verge of another worldwide deflationary dynamic. But many factors are different today, such as very large world markets, far better communications and, of course, the integrated circuit.
Nature seems to prefer geometric growth. Cell division and the multiplication of individuals in a species are primary drivers for living systems. In addition, complex feedback systems combine with accelerating growth curves to keep the entire system far from equilibrium. Despite that chaotic and seemingly out-of-control design, nature's system has worked remarkably well for billions of years. If only we understood how it works, we might better grasp the rapidly expanding modern industrial scene, which is now global and highly interconnected.
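The contrast the column draws between runaway geometric growth and growth tamed by feedback can be sketched numerically. The snippet below is only an illustration: the doubling ratio and the logistic-map feedback parameter are assumptions chosen for clarity, not figures from the article.

```python
# Minimal sketch: pure geometric growth versus growth damped by a
# saturating feedback term (the classic logistic map). All parameter
# values here are illustrative assumptions, not data from the column.

def geometric(n0, ratio, steps):
    """Pure geometric growth: each step multiplies the value by a fixed ratio."""
    values = [n0]
    for _ in range(steps):
        values.append(values[-1] * ratio)
    return values

def logistic(x0, r, steps):
    """Logistic map x -> r*x*(1-x): growth checked by a (1 - x) feedback term."""
    values = [x0]
    for _ in range(steps):
        x = values[-1]
        values.append(r * x * (1.0 - x))
    return values

# Doubling every step, the unchecked curve explodes: 1, 2, 4, ..., 1024.
g = geometric(1.0, 2.0, 10)

# With feedback (r = 2.5), the same multiplicative impulse settles near
# the fixed point 1 - 1/r = 0.6 instead of growing without bound.
l = logistic(0.01, 2.5, 50)

print(g[-1])   # 1024.0
print(round(l[-1], 6))   # 0.6
```

The point of the sketch is the qualitative difference: the geometric series has no internal brake, while even a simple feedback term holds the trajectory bounded, which is the kind of self-limiting dynamic the column sees in saturating markets.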
The glut in computational capacity we are now experiencing may actually work in our favor. Researchers are putting that capacity to work to unravel the principles of the highly nonlinear dynamics of current industrial and biological systems. Maybe the integrated circuit will be our salvation after all.
By Chappell Brown (firstname.lastname@example.org), managing editor of Technology for EE Times