Most computers at that time were limited by the ability to access
memory, said Wilson. There were 8-MHz clocked processors available, such
as the 68000 and NS16032, but they took four clock ticks to run a memory
cycle, whereas the 6502, in a rather RISC-like fashion, took just one.
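The point about memory cycles can be made with a back-of-the-envelope calculation. The sketch below assumes a 2-MHz 6502 (the clock used in the BBC Micro, a figure not stated in the article) purely for illustration:

```python
# Illustrative comparison of effective memory-access rates.
# A processor that needs several clock ticks per memory cycle
# wastes much of its raw clock rate on each access.

def memory_accesses_per_sec(clock_hz, ticks_per_cycle):
    """Effective memory accesses per second for a given clock and cycle cost."""
    return clock_hz / ticks_per_cycle

# An 8-MHz part taking four ticks per memory cycle (as the article
# says of the 68000 and NS16032)...
m68000 = memory_accesses_per_sec(8_000_000, 4)

# ...achieves no more memory bandwidth than a 2-MHz 6502 that uses
# every clock tick for a memory cycle (2 MHz is an assumed figure).
mos6502 = memory_accesses_per_sec(2_000_000, 1)

print(m68000, mos6502)  # 2000000.0 2000000.0
```

On those assumed numbers, the much cheaper 6502 extracts the same memory bandwidth as the 8-MHz 16-bit parts, which is the limitation Wilson is describing.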
Wilson, Furber and the rest of the team had already started to evolve
from being systems designers, relying on packaged parts, to becoming
chip designers. Ferranti ULAs [uncommitted logic arrays – an early form
of FPGA] were being designed to perform as video controllers and
cassette memory controllers, and this had also taught the team an
appreciation of keeping ICs from running hot.
By 1982-1983 the 16-bit era was
arriving and Acorn was looking for its next processor. Charlie Sporck,
long-time CEO of National Semiconductor, visited Acorn in Cambridge to
try to persuade them to use the NS16032, later renamed the 32016 to
emphasize its 32-bit internals.
"We were on about revision H [of
the chip] and there were so many bugs. We'd been out to Israel to see
their labs. It was a CISC. They had hundreds of engineers and they were
producing errors," recalls Wilson.
However, Wilson later went out to Arizona
to visit Western Design Center, the developers of the 65C02, then
working on a 16-bit follow-on design. "We drove out and found a few
experienced engineers working in domestic bungalows on the 65C816. We
left that place convinced we could design a processor," said Wilson.
By this time reduced instruction set computing (RISC) was in the air.
Research projects on RISC principles at Berkeley and Stanford had been
running since 1981 and the IBM 801 project had also shown how RISC
principles could be applied to high performance.
In October 1983
Acorn opted to design its own 32-bit addressing processor. Wilson
defined the instruction set and Furber worked on the micro-architecture
supported by a small but talented team of engineers.
As had been
the case in Acorn's previous designs, it was not low power per se so
much as simplicity and elegance that drove the design of the Acorn RISC
Machine. It was fabbed for Acorn in 3-micron CMOS by VLSI Technology
Inc. VLSI produced the first ARM silicon on 26 April 1985 and it worked
the first time.
One design goal was to have low-latency
input/output handling like the 6502. That low latency would stand ARM in
good stead for embedded applications later on.
"We knew we wanted to put it in a cheap plastic package. We ended up
with a design consuming 100 mW that could run off the power of the I/O
diodes." That is, the charge built up in the ESD-protection I/O diodes
could continue to run the processor for some time even after Vcc power
had been removed.
The ARM1 had about 25,000 transistors and the
follow-on ARM2 had about 27,000 transistors. The original aim of an
ARM-based computer was achieved in 1987 with the launch of the Acorn
Archimedes. It would be easy to say that the rest is [more recent] history.
Acorn was not destined to enjoy success in its own name. While its
computers continued to enjoy support for a while in the U.K. education
market, the IBM personal computer revolution – powered initially by the 8088 and then the 80286 Intel processors and Microsoft's MS-DOS operating system – was a global
phenomenon driving almost all before it.
In the late 1980s there
were three attempts to spin the processor development business out from
Acorn, Wilson said. Eventually, in 1990, that was achieved, with Apple
Computer and VLSI Technology Inc. backing Advanced RISC Machines Ltd.
as a joint venture.
Wilson was not a founder of ARM, preferring
to stay with Acorn and eventually with DSL chip company Element 14 Ltd.,
which spun off from Acorn and was sold to Broadcom Corp. in October 2000
for about $600 million. "We did so much more than just the ARM at Acorn,
including the ARM250, which was the first [ARM-based] system-on-chip
design, and the ARM7000 and ARM7500FE."
Sophie Wilson did not join
ARM but did provide consultancy back to the company. Wilson worked on
the ARM7 and the ARM7TDMI and was a consultant on all the ARM processors
up to ARM11, but not on the Cortex range.
The ARM processor was
low complexity because it had to be easy to design with limited
resources, and that made it low power. Its small size made it well
suited to the system-chip revolution of the 1990s and it was no accident
that it found early success in the mobile phone.
Wilson concludes: "Hermann Hauser says he gave us the things Intel could never give us, no resources, no time and no money."
Acorn RISC Machine, fabricated for Acorn by VLSI Technology Inc. in 1985 and abbreviated to ARM.
Shrug---it's more complicated than that, my dear Atlas.
If you squint at published benchmarks like SPECmark just right, the more complex ARM models get comparable or better performance per GHz. The real reason the x86 architecture runs your favourite applications is, as you pointed out, because they are not available for ARM, because Wintel.
This is changing slowly: there are reports that PC sales are crashing, and the shiny Wintel front wall is starting to show cracks. Will SolidWorks be available and usable on Android any time soon? Probably not, but the reason is not 'more transistors' or a better architecture on x86.
What an odd reply!
I think the point is that this is a story about people, and at some point in the timeline the main character in the story had a sex change, which is a remarkable thing, making it a glaring omission from the story.
Nobody gives a rat's ass what you used registers for on any CPU. ARMH is successful because customers can design targeted SoCs in 1/4 the time it takes Intel to provide a reference design for what they define as the next mobile CPU, one year too late.
Most processors at the time of the 68000 were CISC or took multiple clocks per instruction. It wasn't until later, when power and size for embedded applications were the drivers, that RISC (ARM) type processors were brought to the forefront.
This is more a *very* brief history of the people responsible for the design of the ARM architecture than it is a history of the conception of the design itself. I walk away from reading this article wanting to know way more about the design philosophy and design choices that were made regarding how the architecture came together. To me, the instruction set and the programmer's model, and the thought process going into their design, constitute much of what I would consider the "shaping" of the architecture. That merited one short paragraph.
That the ARM architecture had to be simple, compact, fast, and have low power consumption is a little obvious. That the framers of the ARM architecture have their roots in the 6502 is little more than interesting trivia.