Meng He and Andrew Siska look at the evolution of embedded processing, and explore the implications of programmable system-on-chip architectures for its future direction.
Nowadays, it is not unusual to see the terms microprocessor and microcontroller used interchangeably. However, the evolution from MPU to MCU, which started more than three decades ago, still goes on.
From the simplest integration of program memory and RAM, to data interfaces and discrete components (ADCs, op-amps, etc.), the integration continues to extend toward programmable system-on-chip architectures.
Phase 1: First Usable Microprocessor Introduced
The evolution began in April of 1974, when Intel introduced the 8080 microprocessor. The 8080, running at 2MHz, executed up to 500,000 instructions per second (its fastest instructions took four clock cycles). The 8080 required three power supplies (-5V, +5V, and +12V) as well as an external two-phase clock source.
Shortly after Intel introduced the 8080 to design engineers, Motorola introduced the 6800, followed by MOS Technology’s 6502 and Zilog’s Z80 (which had an 8080-compatible instruction set). Intel responded to the 8080’s competitors by evolving the 8080 into the 8085. Two-phase clocks and multiple power-supply rails were replaced by a single crystal or oscillator and a single +5V supply. Instruction cycle times were decreasing while instruction sets and processor throughput were increasing.
Intel eventually spun the 8-bit 8085 into the 16-bit 8086, whose 8088 variant IBM used to develop the first PC – and the race was on. As demand for increased speed grew, Intel responded with the 80286, followed by the 80386, then the 80486, and so on. Motorola and AMD quickly followed Intel with the introduction of their own 16-, 32-, and 64-bit microprocessors.
Phase 2: Enter the Microcontroller
As the microprocessor industry evolved, prices for 8-bit devices fell. Hardware designers began incorporating them into their designs to replace discrete digital logic. But even as prices dropped, designers of high-volume, cost-sensitive systems continued to design using discrete logic. A microprocessor required external RAM for data storage, external ROM for program memory, and other peripheral devices to make it useful. In many cases, the cost of the memory and peripheral devices exceeded that of a discrete design.
Given the size of the “embedded” marketplace, Intel and other microprocessor manufacturers began integrating small amounts of RAM, ROM, and a few peripheral components onto a single device. Thus the microcontroller was born!
Intel introduced the 8048 microcontroller with (depending on the variant) 64-256 bytes of internal RAM and internal or external ROM. The 8049 variant had 2k bytes of mask ROM (the 8748 and 8749 had EPROM), which could be replaced by 4k bytes of external ROM; it also had 128 bytes of RAM, 27 I/O lines, and an internal 8-bit timer/counter. The 8048 was used in the original IBM PC keyboard, and its 8042 variant served as the keyboard controller in the IBM PC/AT.
In 1980, Intel introduced the 8051 microcontroller. The 8051 provided a CPU, 128 bytes of RAM, up to 4k bytes of ROM, four byte-wide bidirectional I/O ports, interrupt logic, two or three timers (8- and 16-bit), bit-manipulation instructions, and one or two USARTs in a single package. It also had a 16-bit external address space, allowing up to 64k bytes of external RAM and ROM.
The original 8051 required 12 clock cycles per machine cycle, so with a 12MHz crystal or oscillator the machine-cycle rate was 1MHz. Most 8051 instructions executed in one or two machine cycles.
Given its popularity, Intel’s 8051 has been copied in one form or another by over 20 manufacturers over the years. Some variants run with clocks up to 100MHz and require only four, two, or even one clock per machine cycle.
Device offerings from various silicon manufacturers include enhancements such as multiple 16-bit data pointers and a host of peripheral devices such as I2C, SPI, CAN and LIN, USB, PWM, comparators, A/D converters… the list goes on.
Over the years, microcontrollers of various bus sizes (8-bit through 64-bit), pin counts, speeds, RAM/ROM sizes, and internal peripherals have been developed by numerous silicon companies such as Cypress, Atmel, Microchip, Renesas, and Motorola. System designers can now choose a device to fit their design rather than choose a design path to fit a device.
Phase 3: Enter Analog
As more and more digital peripherals were being integrated into various microcontrollers, device vendors began integrating commonly used analog components such as comparators and op-amps. Thus began another cycle in the evolution of the microcontroller – programmable analog coexisting in the same device package as its digital relatives.
This current evolutionary step represents the state of the art in the microcontroller industry. MPUs and MCUs may appear similar in complexity, but the MCU’s high level of integration results in quite a different internal system.
MCUs now allow designers to build a complete system with a single chip: analog and digital input, processing, and output – fully configurable, with any pin routable to any peripheral, the analog blocks running fully independent of the digital system, or analog and digital peripherals blissfully coexisting – a true system on a chip.
Meng He graduated from Marquette University with a Master of Science degree in Electrical Engineering and has been working at Cypress Semiconductor as a product manager since 2007. You can contact Meng at firstname.lastname@example.org.
Andrew Siska has been a circuit designer for the past 30 years. He holds a BSEE and MBA and is currently working for Cypress Semiconductor as Senior Staff Application Engineer.