Editor's note - Chuck is responding to a question I posed
about 8-bit MCUs in a recent
edition of the Focus on MCU newsletter: "What are your thoughts on
8-bit versus 16-bit versus 32-bit MCUs?"
Why, yes, 8 bits of embedded micro can be quite a bit of horsepower.
I've been either in or attached to this business (microprocessors,
microcontrollers) almost literally since its inception at Intel
with the 4004. My first computer system design used an 8008 back in
the early-to-mid '70s. I soon graduated to the 6502 family and
dwelled there for a number of years, developing many peripheral products
built on the 6500 architecture. I went on to deploy Z80s, and the
8088. I then dove into the single-chip micro business, incorporating my
first PIC - waaaaay back in the General Instrument days, when the part
was still PMOS-based! I used it in my design of the Mouse Systems optical
mouse in the early-to-mid '80s, and boy did we sell a LOT of those (at
least a couple million, even back then).
I eventually found myself developing hardware for and programming the
IBM PC, which frankly still felt like an 8-bitter to me with nicely
extended register sizes.
I forayed into the 16-bit world very briefly, developing code for an
embedded HC16 in a KLA-Tencor image computer system, but always found
it awkward to use, since I kept trying to look at it as an 8-bit
machine (which was all I really needed - I didn't pick the processor,
being just the contract firmware engineer on the project).
I KNOW 32-bit machines like the back of my hand, after working many
years at Intel, validating the heck out of all its 32-bit CPU
architectures during my tenure there. Here's how I feel about the IA32
engine: turn back to the days of the Apple ][, and we see countless
millions of units sold, a highly successful string of machines built on
8 bits of architecture with a measly 1-2 MHz of horsepower. There was
excellent software available in the form of VisiCalc, highly
entertaining games, various word processors, database engines, etc.
Throw in the Z80 co-processor cards, and you had Multiplan, SuperCalc,
WordStar, dBASE II, Word, and a host of other highly impressive and
very usable business-class products on these 8-bit platforms. With the
exception of hardware accelerators, such as multiply and divide and
address register extensions, what else was useful about 16- and now 32-bit
processors for the mainstream markets when it came to single-user
apps? Sure, today we have amazing 3D-like gaming, gigabyte
addressing, and multiple apps running simultaneously. But is this
progress, or is it more of the same tied up with a prettier bow?
But now I find myself once again independent and developing a product
around an 8-bit microcontroller solution. Only this time it's Atmel-based,
and boy am I impressed. We clock away at 20 MHz, and are able to
read an external serial FLASH device at 10 Mbit/s to feed a real-time
interpretive engine. Now I'm clocking 20 times as fast as I used to do
back in my Apple days, with a RISC architecture that delivers
nearly 1 MIPS/MHz and takes only a couple of milliamps to do it. Because of
the simple 16-bit register-move extension, any word-based data is
nicely and easily accessed and manipulated when occasionally required.
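For the curious, a bare-bones version of that serial-flash read path looks something like the sketch below. This is an illustration only, not the product code: the specific AVR device, the pin mapping, and the generic 25-series READ command (0x03) are assumptions. The point is simply that with the SPI module in double-speed mode, a 20 MHz AVR shifts bits at 10 MHz.

/* Sketch only: AVR SPI master at F_CPU/2 (10 Mbit/s on a 20 MHz part)
 * streaming bytes from a generic 25-series SPI NOR flash using the
 * standard 0x03 READ command. Pin assignments vary by device and are
 * assumed here (PB4..PB7); the hardware SS pin must be an output or
 * held high in master mode. */
#include <avr/io.h>
#include <stdint.h>

#define FLASH_CS   PB4   /* chip select (assumed wiring) */
#define SPI_MOSI   PB5
#define SPI_SCK    PB7

static void spi_init(void)
{
    DDRB  |= _BV(FLASH_CS) | _BV(SPI_MOSI) | _BV(SPI_SCK);  /* outputs  */
    PORTB |= _BV(FLASH_CS);                                  /* deselect */
    SPCR = _BV(SPE) | _BV(MSTR);    /* enable SPI, master, mode 0        */
    SPSR = _BV(SPI2X);              /* double speed: SCK = F_CPU/2       */
}

static uint8_t spi_xfer(uint8_t out)
{
    SPDR = out;
    while (!(SPSR & _BV(SPIF)))     /* wait for the shift to complete    */
        ;
    return SPDR;
}

/* Read 'len' bytes starting at a 24-bit flash address. */
static void flash_read(uint32_t addr, uint8_t *dst, uint16_t len)
{
    PORTB &= ~_BV(FLASH_CS);        /* select the flash                  */
    spi_xfer(0x03);                 /* READ command                      */
    spi_xfer((uint8_t)(addr >> 16));
    spi_xfer((uint8_t)(addr >> 8));
    spi_xfer((uint8_t)addr);
    while (len--)
        *dst++ = spi_xfer(0xFF);    /* clock data out, byte by byte      */
    PORTB |= _BV(FLASH_CS);         /* deselect                          */
}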
There are lots of real-world embedded solutions that barely need data
types beyond bit and byte. With the best target-specific C
compilers churning out such efficient code these days, it really
doesn't matter that we're only using 8 bits, because we're using 8 bits
efficiently and cleanly now, and 16- and 32-bit machines would cost us
more on several levels while delivering no real added benefit.
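To make "bit and byte" concrete, here is a hypothetical control block of the kind I mean (the structure and field names are invented for illustration): every field fits an 8-bit type naturally, so an 8-bit ALU handles each access in a single native-width operation, with the occasional 16-bit word thrown in.

#include <stdint.h>

/* Hypothetical small-appliance control block: everything is a bit flag
 * or a byte, so an 8-bit CPU touches each field in one native operation. */
struct heater_ctrl {
    uint8_t  flags;        /* bit 0: running, bit 1: fault, bit 2: door open */
    uint8_t  setpoint_c;   /* target temperature, 0..255 degrees C           */
    uint8_t  measured_c;   /* latest reading                                 */
    uint8_t  duty;         /* PWM duty, 0..255                               */
    uint16_t uptime_s;     /* the occasional 16-bit word                     */
};

#define F_RUNNING  (1u << 0)
#define F_FAULT    (1u << 1)

static void heater_step(struct heater_ctrl *h)
{
    if (h->flags & F_FAULT) {       /* bit test on a byte                 */
        h->duty = 0;
        return;
    }
    /* simple bang-bang control in byte arithmetic */
    h->duty = (h->measured_c < h->setpoint_c) ? 255u : 0u;
    h->flags |= F_RUNNING;          /* bit set: read-modify-write a byte  */
}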
Frankly, many of us do just fine with 8 bits, and there really isn't
any reason to migrate upward in device architecture. I think one reason
behind lagging 16-bit sales is that 8-bitters can clock fast enough,
and have enough 16-bit extensions, that a true 16-bit machine isn't
needed. And 32 bits? Why? All of my data structures are bit-, byte-, and
occasionally word-oriented. My Atmel architecture screams, but you can
find 8-bitters in the 100-200 MHz range if you need them. That's plenty
of horsepower for the real world in all but the most insanely critical
embedded solutions. Specialty applications need the bit width and the
high speed, sure, so let them have it.
And the cost is so dang low! I can buy 8-bit solutions in SOT-23-style
packaging in volume for quarters and dimes! What do you think has a
higher manufacturing volume: the laser printer or the toothbrush? With
8-bitters costing so little, we can now have the $5 smart electric
toothbrush if we want. I seriously doubt that 32-bit machines will ever
become cheap enough to displace the venerable 8-bit architecture. It's
a matter of scale and application. The 8-bit machine is so well suited
for the majority of applications, and can be made so small and so
cheap, that we're hitting up against the real-world limits of cost
based on packaging alone.
What's the point of applying 32 bits where 8 more than suffices? It's
doubtful a 32-bit machine could ever be cost-competitive with an 8-bit one.
It's not just a volume issue, as we've mentioned; it's total packaged
cost. And I just don't see 32-bit engines in transistor-sized SMT
packages. There are so many 8-bit package options and peripheral mixes
and memory sizes and clock rates available from just a handful of
manufacturers that it's nothing short of bewildering. Suffice it to say that
just about every imaginable need is well covered these days.
It may make sense to consolidate multiple 8-bit machines into a single
32-bit solution when it comes to the automotive market. Perhaps it's a
cost consolidation that is overdue. But there will still be 8-bit
micros in those same autos, I'm sure. The siren song of more horsepower
in bit width has lured the embedded engineer for decades, yet the 8-bit
micro still reigns supreme.
I've heard the predictions from industry pundits and marketing
departments alike for probably 30 or more years now, claiming the death
of the venerable 8-bit machine. Now the newest data from at least one
source says 2010 is the cross-over year. Yeah, sure. Talk to me next year.
Great letter, Chuck.
Some manufacturers are making it really easy to pick the best of both worlds for the application at hand. We've recently switched from an ARM-based micro to the Freescale Flexis family, where you pick the core: 8 or 32 bits. The peripheral set and footprints are identical for the HC08 and ColdFire V1 cores! Do you want horsepower or low current consumption? Check the right box on the compiler and away you go. Want to change cores? Check the other box and recompile; it's that easy.
For some applications we found that using the 32-bit ARM and throttling the clock back to reduce power consumption increased the latency of critical routines. These routines are much better handled by an 8-bit processor at a higher clock, and at a lower current consumption.
I agree: 8-bitters will be around for a long time.
My guess is that traditional embedded applications, the ones with intensive I/O to physical sensors and actuators, relatively simple control algorithms, and entry-level networking, are perfectly well served by 8-bit microcontrollers. Applications of that kind don't need many on-chip resources beyond a varied mix of peripheral controllers.
But applications that demand DSP, large memory, or sophisticated networking are more conveniently implemented with a 32-bit microcontroller.
It is well known that the major part of the silicon real estate is occupied by memory. To my understanding, the manufacturing cost of a 32-bit microcontroller versus an 8-bit microcontroller is almost equal for the same set of peripherals, memory size, package, and so on.
The really big difference between 8-bit and 32-bit architectures is the software development tools. For 32-bit architectures, the tools are much more powerful, flexible, and convenient than for 8-bit architectures, which greatly influences the cost of software development. I mean, of course, those applications that could be implemented with either core architecture.
For a great many of the jobs that I take on, an 8-bit micro might well be capable, but I still use a 32-bit part. Why? Because my customers get better value if I use a more expensive part but deliver the solution quicker, with lower development costs. In low to medium volumes the cost of the processor core is a very minor part of the total project cost.
Another factor pushing the balance in favour of 32-bitters is the increasing demand for Ethernet and TCP/IP support in even quite small systems. While you can just about squeeze this into an 8-bit processor, your choice of 32-bit (mainly ARM-based) parts with on-chip Ethernet support is greater and the performance is MUCH better.
When you compare an 8-bit and a 32-bit device with 64K of RAM, 256K of flash, and CAN, USB, and Ethernet all on chip, you find that the price difference is rather small, because the peripherals cost much more than the core.
This pushes up the volume at which the total cost of ownership crosses over.
Thanks for your kind words and great feedback. I agree with you: the smart engineer knows how to pick the right tool for the right job, and uses a bit-width scaled to the job's requirements. It's been my experience that 8 bits is more than enough for the lion's share of embedded applications I've worked on, from computer mice to industrial control to platform stress testing.
You make a lot of excellent points. We still do extensive work with 8-bit microcontrollers and love the Atmel AVR range. We see a solid forward market for 8-bit parts because of their ease of use, power management, and cost benefits.
We also work with 16-bit and 32-bit microcontrollers, and these have their place too. It is about picking the best tool for the job.