By Jim Turley, Embedded Systems Programming
Goodbye binary arithmetic,
instruction sets, and assembly language programming. Hello 4.5-billion
transistor, 250GHz processors. O brave new embedded world!
There are 10 kinds of people in the world: those who understand binary and
those who don't. Those of us who do understand binary (and who get the joke)
will be fewer and farther between in a world 15 years hence. Like the ability to
tell time on a clock with hands or tie shoelaces without Velcro, an
understanding of binary arithmetic or assembly-level programming will be a lost
and forgotten skill.
That's a good thing. Few mourn the loss of broadswords, 8-inch floppies, or
phrenologists. The purpose of embedded technology is to blend into the
background, to become an everyday part of our lives. Our grandchildren are not
likely to become programmers in the sense that we understand it, any more than
most of us became colliers, lamplighters, or mule-team drivers.
How far is fifteen years?
When predicting the future, it's pretty tempting to
extrapolate from existing data points. Fifteen years ago was 1988. Cheers
was still on TV, Roger Rabbit was in movie theatres, and Bobby McFerrin
had the #1 record. Motorola's 68030 chip was brand new and not yet considered an
embedded processor. Intel's 25MHz 486 was still a year away; 386-based PCs
humming a 16MHz tune were at the top of the charts. The bizarre Inmos Transputer
chip was in its fourth (and nearly final) generation. In the burgeoning RISC
camp, the 88100 was new, shoring up the early 88000. Berkeley's (and later
Sun's) SPARC dynasty had just begun; Stanford's MIPS project was only slightly
older. National's 32000 architecture had circled the drain for the last time,
proving that "elegance is everything" except what's needed to survive.
PowerPC hadn't been invented yet. That gleam didn't enter IBM's eye for
another three years, and the first 601 chip was two years beyond that.
Macintoshes were straining 68020s and '030s. England's ARM had yet to discover
royalty as the basis for its business; the first ARM6 chips were still five
years over the horizon. OS/2 adorned some IBM machines, to profound disinterest
from customers. Steve Jobs' NeXT introduced its first black box with a CD-ROM,
no floppy drive, and something called Display PostScript.
In 1988 we had yet to see a million-transistor microprocessor chip. That
distinction was to go to Intel's 486DX the following year. Now 15 years on, a
mainstream Pentium 4 processor contains 45 million transistors, but that's not
half the complexity of graphics chips from nVidia and ATI, with 125 million
transistors. In 2002, about 6 billion new processor chips were made and sold: one
new processor for every man, woman, and child on the planet. In total, something
like 60 million transistors were fabricated for every human. Semiconductor
transistors are as plentiful as grains of rice, and almost as cheap.
Connecting the dots is easy. Drawing a curve from yesterday's 16MHz chips to
today's 3GHz parts suggests we'll have 250GHz processors in fifteen years' time.
The transistor trend line passes through 4.5 billion transistors per processor
over the same span.
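If you want to check the curve-drawing yourself, it's nothing more than
compound-growth arithmetic. The short C program below derives the annual growth
rates implied by the endpoints quoted above; the clock speeds and transistor
counts come from this article, the rates are computed from them, and the
program itself is merely an illustrative sketch:

#include <stdio.h>
#include <math.h>

/* Compound annual growth rate implied by going from `start` to `end`
 * over `years` years. */
static double cagr(double start, double end, double years)
{
    return pow(end / start, 1.0 / years) - 1.0;
}

int main(void)
{
    printf("clock, 1988-2003:     %4.1f%% per year\n",
           100.0 * cagr(16e6, 3e9, 15.0));    /* roughly 42%/yr */
    printf("clock, 2003-2018:     %4.1f%% per year\n",
           100.0 * cagr(3e9, 250e9, 15.0));   /* ~34%/yr gets you 250GHz */
    printf("transistors, forward: %4.1f%% per year\n",
           100.0 * cagr(45e6, 4.5e9, 15.0));  /* ~36%/yr gets you 4.5 billion */
    return 0;
}

Note that the forward transistor rate lands close to the 38% compound annual
rate usually attributed to Moore's Law.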
Swell, but what does it all mean?
Extrapolating from 15 years ago suggests that some of
2018's most popular microprocessors haven't been invented yet. That's definitely
true. New processors and instruction set architectures are being invented
weekly. Some won't survive the next year or the next round of funding, but many
will find a niche in embedded's ever-changing ecosystem. We need new processors
because we keep inventing new embedded systems to put them in.
Future embedded processors will be a combination of the unrecognizable and
the all-too-familiar. Unrecognizable because they'll be festooned with
coprocessors and accelerators, intrachip networks, elaborate value speculation,
unfathomable branch prediction, enormous multilevel caches, and inscrutable
instruction sets. All-too-familiar because old embedded processors never die.
Expect to see 8051s, Z80s, and more descendants of the x86 line soldier on in
new designs for decades to come.
Multiprocessors and multiple processors per device will be the rule.
Microprocessors themselves (that is, the CPU cores) are already way too small to
fill up a normal chip. Today designers fill the rest of their chips with
peripherals, caches, memory, and more microprocessors. (The average ASIC that
contains any processors at all already has more than three.) There's plenty of room to
make microprocessors fabulously complex without taking up too much silicon.
Svelte silhouettes count for little in this business.
Turning superscalar processors into massively parallel machines won't have
much payoff. Superscalar execution wastes time and power, almost by definition.
Instead, future generations of über-processors will cooperate, sharing tasks
among hundreds of processors on the same chip. Interprocessor communication and
on-chip networks will take up most of the silicon. The processors themselves
will take a technological back seat to the tiny networks that help them
communicate.
And what of RISC and CISC? Instruction sets will become increasingly
irrelevant in the sense that we'll never see them. Nobody will program in
assembly language; mnemonics will be like so much Sanskrit. Besides, the "real"
instruction set may not be published, identified, or knowable. Think of
Transmeta's Crusoe processor, with its apparent x86 instruction set but with a
different and undocumented set of internal hardware instructions. When you're
programming in a high-level language, who cares?
The whole concept of instruction sets will be a quaint anachronism anyway.
Processors will likely adapt their features over time, learning and morphing as
they adjust to changing workloads. Mass-produced processors may all ship from
the factory the same way, but they'll likely transmogrify into something
different in the field. Like children graduating from elementary school, they'll
have enough education to get by, but their ultimate character will depend on
life experience.
At the extreme end of this spectrum will be "soft" virtual processors, with
almost no native instructions at all. Instead, they'll run emulation or
binary-translation code as a kind of ultra-low-level operating system that
enables them to execute software from any legacy processor. Itanium XXXVII,
PowerPC G500, ARM42: it'll all be the same to these chips.
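For the skeptical, the heart of any such emulation layer is just a
fetch-decode-execute loop written in software. The C sketch below runs a tiny
program for a made-up two-instruction "legacy" machine; the instruction set is
pure invention for illustration, and a real binary translator (Transmeta's
Code Morphing Software, say) caches translated native code rather than
re-decoding every instruction the way this loop does:

#include <stdio.h>
#include <stdint.h>
#include <stddef.h>

/* A hypothetical legacy instruction set, invented for this example. */
enum { OP_LOADI = 0x01, OP_ADD = 0x02, OP_HALT = 0xFF };

int main(void)
{
    /* Legacy program image: r0 = 2; r1 = 3; r0 += r1; halt */
    const uint8_t image[] = { OP_LOADI, 0, 2,
                              OP_LOADI, 1, 3,
                              OP_ADD,   0, 1,
                              OP_HALT };
    uint32_t reg[4] = { 0 };
    size_t pc = 0;

    for (;;) {                       /* fetch-decode-execute */
        uint8_t op = image[pc++];
        if (op == OP_LOADI) {        /* reg[r] = 8-bit immediate */
            uint8_t r = image[pc++];
            reg[r] = image[pc++];
        } else if (op == OP_ADD) {   /* reg[d] += reg[s] */
            uint8_t d = image[pc++];
            uint8_t s = image[pc++];
            reg[d] += reg[s];
        } else {
            break;                   /* OP_HALT, or anything unrecognized */
        }
    }
    printf("r0 = %u\n", (unsigned)reg[0]);   /* prints: r0 = 5 */
    return 0;
}

Swap in a different decoder and the same hardware "becomes" a different
processor, which is the whole point of a soft virtual machine.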
Thinking inside the box
There's a great scene in Minority Report in which Tom Cruise's
cereal box plays 30-second commercials while he eats. That's just the kind of
strange-but-true embedded system we'll all be accustomed to in 15 years. They'll
be a combination of the familiar and mundane with the leading-edge and
technological. All rolled up in a package that makes money for someone. I would
expect that cereal box to be touch-sensitive and to track click-throughs, too.
Today the average American household contains around 40 microprocessors (not
counting a few dozen per car and another 5 to 10 in personal computers). Figure
on that number growing to about 4,000, most of them dedicated to entertainment.
Video games, video terminals, multiple levels of wireless networking, media
caching, and always-on access to friends, news, entertainment, and data will
keep our homes humming and millions of MIPS flowing. With terabytes and
petabytes of storage, you'll be able to store every book, every song, and every
movie you've ever seen or ever want to see.
Toys drive technology. Most of the growth in embedded sales, and most of the
advancements in embedded processors, will come from consumer electronics, toys,
games, and entertainment, not computers. That's because computing problems in
everyday life don't get much harder, but virtual reality problems do. We don't
all need to do our own weather prediction or (one hopes) calculate missile
trajectories, so the demand for "computer" performance isn't great. But we're a
long way from perfect, photorealistic, real-time video. That's an area that can
easily soak up an infinite amount of computing power and produce cool products
we'll pay for.
Physical materials will still cost money but processing horsepower will be
essentially free. Speakers, plasma screens, and headphones have a cost but
spatial audio positioning, noise cancellation, and so forth will be easy and
ubiquitous. Wholly synthetic actors will star in computer-generated movies
produced in real time based on market demand or daily events. Many celebrities
will be cybernetic.
With processing power cheaper than mechanical equivalents, even mundane
devices get booted up to the next level. Rear-view mirrors with embedded image
sensors will recognize an impending collision and warn both drivers. Tiny
webcams with wireless connections that cost next to nothing and are no bigger
than a coin will push the boundaries of privacy. 10GHz processors will be so
cheap and ubiquitous that they'll be disposable. We'll throw away the equivalent
of a Cray supercomputer with each week's trash. GPS receivers and wireless
transmitters will be so common that every solid object of any value will know
where it is and can tell you so. Cell phones (and wireless data networks) will
become nonproducts: uninteresting by themselves, but an added feature on another
product, like AM radios today. Even now, researchers can transmit sound waves
through arm and hand bones; to talk on the phone, you stick your finger in your
ear.
Ancient computers required enormous power supplies. Today's
handheld PDAs run for weeks on battery power while outperforming their primeval
cousins. Power efficiency will continue to rise, and power consumption will
fall. Fifteen years from now it'll almost be possible to run a microprocessor on
the electron energy that decays from its own materials. Piezo-electric power
from shaking or squeezing the chip will be enough to drive most low-end
microprocessors, like a self-winding wristwatch. But our demand for processing
performance will rise faster than power consumption will fall, so such a scheme
won't be widespread. Moore's Law throws us transistors at a 38% compound annual
rate but "Eveready's Law" does not, alas, keep up.
The far side
Fifteen years is a good long while, but not long enough for some of the stranger
predictions to come into being. Maybe there won't be microprocessors at all, for
example. Dynamically reconfigurable logic has captured the imagination (and
investment capital) of many who see it as an efficient alternative to
microprocessors. Instead of programming a fixed processor with variable
instructions, why not just make the hardware vary over time to fit the problem?
Reconfigurable computing promises chameleon-like hardware that changes itself
on the fly. It's a sexy, alluring, and elegant new technology that has
everything going for it, if you don't count the last few hundred years of human
history. Technical elegance doesn't win market share, and in a world where
computers are dominated by x86 hardware and DOS-derived software, sophistication
seems downright counterproductive. Fate has a perverse streak.
Nor will we see Java processors in the future. Java will be remembered as a
spectacular marketing success overshadowing a minor technical curiosity. It's
fundamentally unsuited to hardware (any hardware) and will gradually be forgotten.
Microsoft's C# knock-off will likewise edge its way toward the abyss. More
likely, some as-yet undiscovered hardware-cum-software language will emerge as
the design and programming tool of choice for new processors.
Inertia and momentum exert their influence on technology, just as they do on
tides and bodies in motion. We don't do what's best for us; we do what's easy.
Legacy software and familiar user interfaces have a way of sticking around in
spite of "better" alternatives. The human animal can accept only so much change.
We adapt slowly to the world that we ourselves are altering. Our fastest bullet
trains run on rails 4 feet 8 1/2 inches apart because that's the width of a
horse-drawn wagon, which in turn followed the ruts left by Roman armies two
thousand years ago. Backward compatibility colors and flavors a lot of today's
technology, often against all intellectual reason.
Embedded processors make up 98% of all the processors sold. (PCs
fill in most of the rest; workstations are statistically insignificant.) That
ratio will get even more lopsided in 15 years: not because PC volume will
drastically fall, but because embedded volume will drastically rise. Sales of
32-bit embedded processors have doubled and tripled in recent years; we're
witnessing the proverbial "hockey stick" curve of rapid early growth. Much of
that growth is in communications infrastructure.
Our phones, computers, traffic lights, and gas meters now collectively rely
on millions of embedded processors that we never see and care little about.
Closer to home our cars, kitchens, and cable TV rely on hundreds more embedded
processors. A musical greeting card has more computing power than NASA's lunar
lander did in 1969. We wear computers on our clothes in the form of pagers,
e-mail terminals, PDAs, and mobile telephones. Thousands of people have embedded
processors under their skin, as pacemakers or hearing aids.
"Any sufficiently advanced technology is indistinguishable from magic," wrote
Arthur C. Clarke. Today we speak into the air and a person across the globe
hears us perfectly. Every day we look into a glass mirror and see actors we've
never met on a stage we've never visited. We twitch our thumbs and a
sword-wielding hero battles dragons in an epic tale that has no ending save what
we determine to give it. Magical, indeed, yet trifles in today's world. Magical
indeed will be the treasures that await.
Jim Turley is an independent analyst, columnist, and speaker
specializing in microprocessors and semiconductor intellectual property. A past
editor of Microprocessor Report and Embedded Processor Watch, he has
written several books, including The Essential Guide to Semiconductors
and Advanced 386 Programming Techniques. For a good time, write to email@example.com.