I probably shouldn't admit this, but I have vivid memories of engineering and electronics design before the personal computer. We're talking antediluvian here, the mid- to late 1970s, when guys like Bill Gates and Steve Jobs were still in school and pretty much everything in this world was assembled from discrete components and from packaged collections of them, the integrated circuits (ICs).
It didn't take long for the IC to evolve into a complete, programmable computer on a chip. Intel led the way with the 4-bit 4004, soon followed by the fabled 8-bit 8080. Back then, Intel had scores of microprocessor competitors, from RCA to Texas Instruments, National Semiconductor, Zilog and Siliconix.
It was an immensely exciting time when engineers, geeks and home hobbyists formed local computer clubs, swapping chips, IMSAI and Altair circuit boards, computer code and home-grown applications. At the time, I was a reporter for what was then the industry's leading electronics publication, and my esteemed editors greeted this new phenomenon in electronics and computing with a collective, ambivalent shrug.
But I couldn't go anywhere without running into brilliant people who had been bitten by the microcomputer bug. What I remember most about the dawn of the microcomputer was the aura of boundless, soaring optimism and sense of exploration and unlimited possibility that surrounded this grassroots movement.
Applications for these newfangled computers seemed endless. Embedded microprocessors turned the staid world of instrumentation on its head, and it seemed as though we were on the threshold of a cosmic revolution of ubiquitous computer control and communications.
Alas, computerdom's utopian promise was short-lived. Soon, Big Blue and a Faustian geek named Gates formed a pact with the devil, and the IBM PC was born. Soon there came to be one dominant chip made by Intel and one OS made by Microsoft, and this de facto standard would rule the world for the next 20 years. A hefty dose of corporate weed killer stopped the grassroots computer movement dead in its tracks.
Fast forward to the future. Today, the PC is graying at the temples, a mature architecture and a mature business that's about to become a Chinese commodity. What has taken its place? An explosion in embedded computing, signaling an immensely exciting time when system builders can assemble powerful new applications using open-source software, off-the-shelf microcontroller subsystems and a spectrum of wireless, networking and data communications component technologies to solve real-world problems.
Thanks to embedded processors, smart sensors, the Internet, digital signal processors, Java, the Global Positioning System, optoelectronics and more, we are again on the threshold of a new age of fast, powerful processing that will change the world. Once again, I sense a boundless, soaring optimism and spirit of exploration and unlimited possibility. Like the pre-PC era, today's embedded computing landscape is an open, level playing field, not locked into one architecture, one OS or one global standard.
This is no utopia, and it's not just a grassroots, Woodstock computer revival. This is the shape of tomorrow. As solution providers and system builders, you are on the cutting edge of the next big thing in electronics.
You've come to a fork in the road. As Yogi Berra would say, "Take it." This is a chance for you and your customers to create the future.
Richard Wallace is Editorial Director, EE Times Network. He can be reached at firstname.lastname@example.org.