Steve Furber has been present at the start of a string of changes that have transformed the computer industry. At Acorn Computers, Furber was a designer of the BBC Micro, the personal computer that introduced a generation of British students to computing. He was the hardware architect of the first ARM processor and went on to direct the University of Manchester's Amulet program, which produced the first asynchronous implementation of a commercial microprocessor. Today Furber continues to explore asynchronous design at Manchester, where he is investigating how the human brain might be analyzed as a large asynchronous system.
EE Times: What led you to your career?
Steve Furber: I read maths at Cambridge, but I was very much on the applied side of mathematics. I was interested in the engineering problems, and especially fluid dynamics, more than the pure mathematics. When I started looking for a graduate program, I was drawn to a professor at Cambridge who was a renowned expert in aeronautical acoustics. He introduced me to the work that Torkel Weis-Fogh had done on hovering insects.
EET: Hovering insects?
Furber: It turns out that there is this insect that is so small, in fact, that to its wings the air appears almost as viscous as a liquid. In other words, it operates at a very low Reynolds number. And Weis-Fogh had observed that this insect depended for its hovering on an aerodynamic interaction between its wings when the wings get very close together. There is speculation, by the way, that when a flock of pigeons takes off and you hear their wings clapping, they may be using the same effect.
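(For reference, the Reynolds number is Re = ρvL/μ, the ratio of inertial to viscous forces in a flow; at the tiny wing length and speed of such an insect, viscous forces dominate and the air behaves almost like a liquid.)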
EET: And this led to an interest in computer architecture?
Furber: While I was a research fellow, a group of students formed a club, really, to play with microcomputers. Those were the days when real men built their own computers out of 7400-series logic; only wimps used microprocessors. But I took the wimp approach and built some computers using a microprocessor. Pure amateur interest.
About that time Hermann Hauser (he's now the CEO of Amadeus Partners, the venture capital firm) was starting a little company that he called Cambridge Processor Unit. OK, CPU. Later on he took the code name of one of our projects, and [the company] became Acorn Computers. Since I had designed some computers and actually used one of them, CPU was interested, and I ended up moonlighting for them.
The BBC at that time was planning a television series on computers, and they thought it would be a marvelous idea if there was a real microcomputer that people could buy and follow along, as it were. They had contracted with a firm to produce one, but they were getting quite displeased with the project, so somebody at the BBC called Hermann and said, "We'd like to come by next week. Can you show us a microcomputer that meets the following specs?"
Well, Hermann called Sophie Wilson (actually, she was Roger Wilson at that time), who was well-known in Cambridge microcomputer circles. Roger said, "Don't be silly." So then Hermann called me and said, "Roger thinks this can be done in a week. What do you say?" And of course I said, "Well, if Roger says so . . ."
Then Hermann called Roger back and said, "Steve thinks it's no problem. What do you say?" Somehow, we had a machine running Basic when the BBC got there. They were so impressed they shifted their strategy from a Z80 running CP/M to a 6502 with a proprietary operating system, and gave us the contract.
My research fellowship ended in 1981, so I decided to join Acorn. The first BBC Microcomputer shipped in January 1982. Now, up until this point, Acorn had been selling a few machines and kits (a disaster, kits, by the way), and we thought, "This is nice. The BBC is forecasting they'll need 12,000 units." Nobody expected what happened. It turned out that everybody wanted to get into this computer thing. And then the schools started using them. In the end, Acorn sold a million and a half of them.
EET: So what led from the BBC Micro to the ARM?
Furber: When the money started coming in from the BBC Micro, Hermann made the decision that chip design was strategic for us. So he hired a small team of chip designers and bought some Apollo workstations. And we had been watching the RISC team at Berkeley. I thought, "If a bunch of grad students can design a RISC processor in a year, we can too." There was a certain arrogance in that.
In any case, I mapped Sophie's instruction set onto a three-stage pipeline. It was a paper design, using a two-phase clock, which I modeled in about 300 lines of BBC Basic. When I see people today use 10,000 lines of VHDL to model something of similar complexity, it makes me wonder. . . . And in April '85, there were chips running code. It had been under 18 months from inception to executing code.
EET: So that's how the ARM was born. How did you get from there to asynchronous logic?
Furber: Success was a long time in coming for ARM. Eventually, it came when, as Acorn was failing, Apple Computer came to Hermann and said, "We want to use the ARM, but we don't want to buy it from Acorn. Let's spin out a joint venture."
But before that happened, I'd seen some papers on asynchronous design. The seminal paper was Ivan Sutherland's Turing Award paper in 1989 on micropipelines. That got me to thinking about asynchronous design. About that time, early 1990, a chair opened up at Manchester in computer engineering. As it worked out, we moved to Manchester on the first of August, 1990 just about a month before ARM separated from Acorn.
EET: And that put Manchester on the map for asynch design.
Furber: We began to research not just the circuit techniques, but the tools and methodology that would allow us to implement substantial systems in asynchronous logic. The ARM was a natural target. That was the Amulet project. In 1993 we saw the first silicon on the Amulet-1. It was the first asynchronous implementation of a commercial microprocessor. We'd shown feasibility, but we hadn't shown anyone a reason why you'd want to go to all the trouble. By Amulet-2, however, we were showing some distinct advantages in power, speed and RFI emissions.
That got the interest of an industry partner, a German telecoms company. With them we designed Amulet-3, aimed at DECT base stations and built specifically for low RFI.
In the process, we had produced the first multimaster asynchronous bus on silicon. We had produced the first synthesis tool for asynchronous circuits. Since then, we have used the tool for a fully synthesized core for a smart card.
EET: And the bus got you interested in interconnect in general?
Furber: As we were working on low-RFI design, we began to realize that using a bus as the central interconnect for a chip is all wrong. We evolved the notion of self-timed networks on-chip instead. This led John Bainbridge from our team to spin out Silistix, a company that is developing the self-timed network-on-chip area further. They are funded by Intel Capital.
EET: With Silistix spun out, what are you working on today?
Furber: We are continuing to develop the synthesis tool, moving it toward higher-performance circuits. On the processor front, we are also pushing on low power. We see a big need for very low-power processors for sensor networks. And we have an on-chip multiprocessing project under way.
Personally, I'm getting very interested in systems of neurons. We are planning the construction of a large-scale hardware neural simulator, as a basis for research into the behavior of very large asynchronous systems that happen to have neurons as their base elements.
EET: Is there enough understanding of how a neuron works?
Furber: Actually, neurons are quite well understood in detail. But detail is the problem. The equations are hugely complex, too complex to serve in a system model. We have to find an abstraction that captures just the information-processing functions of the cell and ignores the other things, like feeding, self-repair and so on.
We have some reasonably well-developed architectural ideas, but there are still some open questions.
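(One widely used abstraction of the kind Furber describes is the leaky integrate-and-fire neuron: the cell accumulates weighted input spikes, its potential leaks back toward rest, and it emits a spike when a threshold is crossed. The Python sketch below illustrates that general idea only; it is not the specific model chosen for the Manchester project, and all parameters are arbitrary.)

    # Illustrative leaky integrate-and-fire neuron; parameters are arbitrary.
    def lif_neuron(input_spikes, weights, leak=0.9, threshold=1.0):
        """input_spikes: one 0/1 tuple per time step, one entry per synapse."""
        v = 0.0                          # membrane potential
        output = []
        for spikes in input_spikes:
            v = leak * v + sum(w * s for w, s in zip(weights, spikes))
            if v >= threshold:           # threshold crossed: fire and reset
                output.append(1)
                v = 0.0
            else:
                output.append(0)
        return output

    # Two synapses; the neuron fires only once enough weighted input has accumulated.
    print(lif_neuron([(1, 0), (0, 1), (1, 1), (0, 0)], weights=(0.6, 0.3)))  # [0, 0, 1, 0]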
EET: Will this be a multidisciplinary project, then?
Furber: Yes. We are actively pursuing engagements with neuroscientists. The big challenge is that it will probably take two or three years to learn to talk with them; the language is so different from that of systems architecture. Also, we are reaching out to mathematicians. Personally, I believe that we will understand the function of the brain to be a result of the dynamics of large systems, rather than something special about the individual components or the way they are wired. That is a field mathematicians have already explored. And we will need to work with psychologists, who have a very important black-box view.
EET: So the underlying challenge is how the brain works?
Furber: We may have to solve that scientific grand challenge before engineering in large dynamic systems can go forward. I'd be surprised if we get any solid answers in the next 10 years, actually. But I'll be very disappointed if we don't have some good answers within the next 20.
My view today is that the neuron is basically a pattern recognizer. It is the scale of all these individual elements working at once that matters.
Look at the scale. There's a sea slug that has about seven neurons. We actually have quite good models of a sea slug that predict its behavior accurately. A honeybee has, I think, around 850,000 neurons, and it already has very complex behaviors, including complex communications with its peers. Then you have humans. But the neurons are all very similar.
In fact, there is work that suggests that in humans, the neuron interconnection patterns are very similar in two parts of the brain, the front and the back of the cortex. So maybe at the neural level the functions don't change between basic processing of sensory data and higher thought; it's just the level of abstraction represented by the signals that is changing. This leads me to believe that we can apply the right abstractions and understand the complexity. But then computer scientists are very enamored of abstractions; we tend to see them everywhere.
EET: Is there a link between asynchronous CPUs and systems of neurons?
Furber: This is all about the theory of complex dynamic systems. Mathematicians have the tools to deal with that. And mathematicians with whom I've spoken appear interested. We just need to make their tools useful to engineers.