While the rest of the world may be focusing on miniaturization, Harry Porter is headed in the opposite direction. A professor of computer science, Porter has a strong interest in showing exactly how these systems work. To that end, he's built a very impressive relay computer, which occupies a place (make that a big place) of honor on his living room wall.
Harry proudly displaying his relay computer.
Porter's relay computer consists of four physical units: an arithmetic/logic unit, a register unit, a program control unit, and a sequencer unit. Each is housed in a handsome wooden frame with a glass front for display. Everything is organized logically, and LEDs are in place so you can actually watch the data flow through the system while the 415 relays emit a familiar and satisfying cacophony. You can see and hear it in the video below.
The specs, taken from Harry's documentation, are as follows:
Data Bus (8 bits)
Address Bus (16 bits)
All relays are identical (Four-Pole-Double-Throw, 12 Volts)
Max Power Consumption: Estimated 12 Amps @ 13.5 Volts (160 Watts)
Porter completed the computer in 2007. When I asked him if it still worked, he replied "Yes, it is still functional. But it doesn't get much use. I tend to read email on my iPad instead," which is completely understandable. It now waits on standby, ready to be powered on if an excuse presents itself.
Registers and switches.
Programming is quite an arduous task: first select an address, then use the switches to set the byte you wish to enter, and then push that byte into memory. Repeat this process over and over until you're ready to begin your computation. A few sample programs documented on Porter's site demonstrate how the machine does simple addition, subtraction, and multiplication.
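The toggle-in workflow described above can be sketched in a few lines. This is a hypothetical simulation, not Porter's actual control logic; the class and method names are invented, and the "opcodes" are placeholders. The bus widths (8-bit data, 16-bit address) come from the specs listed earlier.

```python
# Hypothetical sketch of front-panel "toggle-in" programming: set the
# address switches, set the data switches, then deposit the byte into
# memory. Names and opcodes are illustrative only.

class FrontPanel:
    def __init__(self, mem_size=65536):
        self.memory = bytearray(mem_size)  # 16-bit address bus -> 64 KiB
        self.address = 0                   # current address switch setting

    def set_address(self, addr):
        self.address = addr & 0xFFFF       # 16 address switches

    def deposit(self, byte):
        self.memory[self.address] = byte & 0xFF      # 8 data switches
        self.address = (self.address + 1) & 0xFFFF   # step to next location

# Toggle in a tiny "program" one byte at a time, as the operator would.
panel = FrontPanel()
panel.set_address(0x0000)
for byte in [0x10, 0x42, 0x20]:   # made-up byte values, just for illustration
    panel.deposit(byte)

print(panel.memory[0:3].hex())    # -> "104220"
```

Each deposit auto-increments the address, which is the usual front-panel convenience: the operator only flips the data switches for consecutive bytes.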
Harry has done a fantastic job of documenting the build. He shares not only a wonderful set of pictures of the process, but also PowerPoint presentations, schematics, and a 60-minute detailed breakdown of the relay computer's design.
A long time ago I had a pre-PC keyboard, I'm thinking it was from Burroughs, that had fairly mushy (no snap-action) keys. They actually built a small solenoid into it to give you a nice loud click for audible feedback, and a little vibration for tactile feedback, on each keypress.
A couple of years ago I was talking to a Russian (now working for an Israeli company) who had worked on a pneumatic computer used to control a Russian nuclear power station. It was the size of a supercomputer, and was all built with pneumatic logic gates and ran off air pumped in by a huge set of fans which were so noisy that the operators had to wear ear protection.
Given the low reliability of the hardware the whole machine used redundant and error-correcting logic (3-way majority voting on *everything* IIRC, gate-level and module-level) such that failed modules could be hot-swapped *while the control program was running* with no effect.
Why do this? Because nothing is more rad-hard if something goes horribly wrong than a computer which doesn't use electronics, only AC power...
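The 3-way majority voting described above is classic triple modular redundancy, and the core idea fits in a few lines. This is a generic TMR sketch under my own naming, not the actual Russian pneumatic design: three replicas of a module compute the same function, and a 2-of-3 vote masks any single failure.

```python
# Minimal sketch of triple modular redundancy (TMR) with 2-of-3 voting:
# run three copies of a module and vote, so one failed copy never
# changes the output. Generic illustration; names are invented.

def vote(a, b, c):
    """2-of-3 majority: correct as long as at most one input is wrong."""
    return (a and b) or (b and c) or (a and c)

def tmr(replicas, inputs):
    """Run three (possibly faulty) replicas on the same inputs and vote."""
    return vote(*(m(inputs) for m in replicas))

# Three replicas of an AND gate, one of which has failed stuck-at-1:
good = lambda x: x[0] and x[1]
stuck = lambda x: True
replicas = (good, good, stuck)

print(tmr(replicas, (True, False)))   # faulty replica is outvoted -> False
print(tmr(replicas, (True, True)))    # -> True
```

Applying the same vote at both gate level and module level, as the comment describes, is what lets a failed module be swapped out live: the other two copies keep the voted output correct the whole time.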
Doubt it was exactly the same course, but I think in that era most computers were shared with punched card input. It was lots of fun. I'd just discovered biorhythms (remember them?) and as an (unofficial) exercise wrote a Fortran program to print them out. Obviously doing graphs on text-based printouts wasn't that easy, but I got it working pretty well. I was tweaking something and did the (for me) inevitable - forgot to close a loop. When I went to get my printout (which was usually about 3 or 4 sheets) I had half a ream of fan fold 132-column paper waiting for me with my biorhythms for the next 50 years...and a VERY stern talking to from the head operator (especially as it wasn't set classwork). If nothing else it taught me to check my loops VERY carefully in future.
Me, too, David. But just barely. I used them in a statistics class at UCSD back in '72. I remember we had to wait to do our class work at night because the serious engineering students had claim to the big computer during the day. My relatively simple statistics problem took about an hour to run. Today, we could all probably get a lot more done sharing a smartphone. ;-)
Speaking of punch cards....some San Francisco trivia; I once heard the four Embarcadero Center office buildings were designed to look like punch cards standing on end. I think of that every day as my ferry pulls into the embarcadero.
Yep, that's a drum card. Wikipedia calls them "program cards", but we never called them that because usually your whole deck of cards was a program. Its main use was for setting tab stops, so you'd have one card for Fortran and another card for assembly language (tab stops for opcode, operands, and comments). But it could also automatically skip columns, duplicate a column from the previous card, and automatically shift to numeric so you wouldn't have to hold down the numeric key when entering data.
If you were entering fixed-format data such as " .word 12345,67012,45670, ..." you could have the drum card automatically copy the ".word" and commas, skip to the correct columns, and shift to numeric so all you had to key in was the numeric values. This was a very fast way to enter a lot of numbers, and I've yet to see a screen-based editor with similar capability.
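The drum card's trick, as described above, can be modeled as a column "program": some columns are constants stamped onto every card, and the operator keys only the numeric fields. This is a rough simulation with invented column positions; the `.word 12345,67012,45670` format is taken from the comment, not from any real keypunch layout.

```python
# Rough sketch of drum-card-style fixed-format entry: constant columns
# are auto-filled for every card, and the operator keys only the numeric
# fields. Column positions here are assumptions for illustration.

TEMPLATE = " " * 8 + ".word" + " " * 11 + "," + " " * 5 + "," + " " * 5
FIELDS = [(19, 24), (25, 30), (31, 36)]   # (start, end) of keyed fields

def punch_card(values):
    """Merge operator-keyed numeric fields into the constant template."""
    card = list(TEMPLATE.ljust(80))       # 80-column card
    for (start, end), value in zip(FIELDS, values):
        card[start:end] = str(value).rjust(end - start)
    return "".join(card).rstrip()

print(punch_card([12345, 67012, 45670]))
# -> "        .word      12345,67012,45670"
```

On the real machine the skip, duplicate, and numeric-shift actions happened mechanically as the card advanced past each column, so the operator's hands never left the numeric keys.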
@Betajet...Drum Cards...was reading more about these... it seems they would enable you to put your statement numbers, contents etc. in the right columns, and also limit what you could type in them...am I right? The machine I used served a number of different users for different purposes, and I don't think a drum card was in use...but I could be wrong. I can see it would be really useful if you were only entering (e.g.) Fortran statements.
I did a FORTRAN course once and you had to punch your cards out on the machine, then take them down to the computer centre (you learned to use 2 elastic bands the first time they fell apart :-) and they'd run it and you'd collect your printed output later. Along with the wrath of the operators if you had an endless loop that chewed up valuable computer time.....
I take it you are talking about something like this:
Personally, I am very grateful that I'm old enough to have come of age at the beginning of the microcomputer revolution and enjoyed the exciting "barnstorming" days before the "suits" took over. Now it's all DRM and patent infringement lawsuits. But there's still plenty of fun to be had in embedded design where you can still program at the bare metal, and in open source hardware and software where there's so little money to be made that the patent extortion entities find better targets.