Yes, it's odd that we don't have a moon base by now. Who'd a thunk, back in 1969, that almost half a century later, we'd still not have been anywhere but the short moon missions?
Flying cars probably never made any sense. Still, it's pretty hard to make accurate predictions, because people have a way of predicting only, at best, improvements to things that exist in their world today. So for instance, if we could fly, even before the days of heavier-than-air craft, then we could dream of "flying to the stars." Seems like a natural progression.
On the other hand, the side-effects of the information revolution, a revolution required to make any technological advancement, turned out to be more important to the average joe than the original purpose (mostly military and space). Hard to have predicted that one!
Going to the moon is incredibly expensive and what is the payoff? Personally I think the payoff is that we can push forward to begin to build ships outside of earth's orbit, but unfortunately the country we live in doesn't hold that as a priority. I believe there are still a few other moderately healthy space programs out there though.
Pardon the nitpick, but just to keep a whole generation of young readers from getting it wrong: It's "core storage," not "cold storage."
The phrase is a reference to the little rings or "cores" that stored a "0" by magnetizing in one direction (e.g. clockwise) and a "1" in the other. And yes, a ring is not really a core, so to nitpick on history as well, it really should have been called "ring storage."
The phrase mostly died away in the late 1970s and early 1980s as semiconductor memory took over, then resurfaced in a most peculiar fashion as users of the then-new PCs started to refer to the non-magnetic storage of their PCs as "core storage" in the sense of "the fast memory at the center or core of my PC."
That too died away after a few years, replaced largely by the acronym RAM, which is actually a bit too generic since it technically includes any directly accessible memory, including, well, core.
Seeing "core" mutate now into "cold" is interesting. But in terms of evolutionary selection this particular mutation leaves me, er, cold. It really does have a nice ring to it, but in the wrong direction, since it implies storage (e.g. archival) that isn't used much. For the "main memory" (by which I mean the fastest and most directly accessible) of a computing device, "hot" might be a lot more apt. It is after all the storage that changes most quickly, and for that matter is quite literally the most heat-generating form of memory (though that pales in magnitude next to processing heat, of course).
So my counter offer: Let's create a spectrum! Hot memory, represented by blue, is the fastest moving, fastest changing, most heat-producing memory, with on-chip registers being right at the edge of ultraviolet. (Engineers, welders, laser specialists, and astronomers all know that blue means hot and red means cool, despite those silly faucet labels that have confused me my entire life.)
Cold memory, represented by red, is the slowest, least often changed, and least often accessed form of memory, what we often call archival storage.
And of course, hot memory is costly and must be used sparingly, while cold memory is much cheaper and far more plentiful. The goal of a well-designed cache architecture is to make all memory look nice and warm, even if the vast majority of it is in reality cold almost to the point of being frozen solid.
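The point about a well-designed cache architecture making cold memory look warm can be sketched in a few lines. This is a generic illustration, not any particular product's design: `TieredStore` and its names are my own invention, with a small "hot" LRU tier in front of a large, cheap "cold" store.

```python
from collections import OrderedDict

# Illustrative sketch: a tiny "hot" tier in front of a big "cold" store.
# Hot hits are cheap; cold misses promote the value into the hot tier,
# evicting the least recently used entry when the hot tier is full.
class TieredStore:
    def __init__(self, hot_capacity):
        self.hot = OrderedDict()          # small, fast, expensive tier
        self.cold = {}                    # big, slow, cheap tier
        self.hot_capacity = hot_capacity

    def put(self, key, value):
        self.cold[key] = value            # everything lives in cold storage

    def get(self, key):
        if key in self.hot:               # hot hit: served at hot speed
            self.hot.move_to_end(key)     # mark as most recently used
            return self.hot[key]
        value = self.cold[key]            # cold miss: the expensive path
        self.hot[key] = value             # promote into the hot tier
        if len(self.hot) > self.hot_capacity:
            self.hot.popitem(last=False)  # evict least recently used
        return value
```

As long as accesses cluster on a small working set, almost everything is served from the hot tier, and the whole store feels "warm" even though most of it is frozen solid.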
I have an 8K core memory board hanging on my office wall, once belonging to a DEC something or another, just below a couple of (very old) Marconi Wireless Telegraph Company stock certificates. There was a story floating around Control Data's early days about its core memory and how the tiny magnetic donuts were hand threaded in Hong Kong because the process needed the fine and delicate hands of Asian women. The location was affectionately known as "Bill Norris' Far East Core House"!
Etmax & Mariski. On the shelf in my office I have a non-destructive readout ferrite core memory in its test jig. It is the normal donut with a second, even smaller hole in it. Wiring that is real needlework.
My first experience with memory drums was the Sperry drums in a stock control computer. The read heads had to be adjusted to within one thousandth of an inch of the oxide surface; the standard gauge to do this was a roll-your-own cigarette paper. The skill was to avoid crashing the ferrite heads into the drum surface. One day the local corner shop ventured to enquire why I regularly purchased cigarette papers and never any tobacco, perhaps suspicious that I was obtaining my "smoking material," whatever that was, from other sources. Those drums were about 18 inches high and about two feet in diameter, with, if my memory serves me, 20 or 30 tracks of 5K bits.
Ron Neale: One day the local corner shop ventured to enquire why I regularly purchased cigarette papers and never any tobacco, perhaps suspicious that I was obtaining my "smoking material," whatever that was, from other sources.
Two rolling papers memories:
Back about 1970, a store in Worcester, MA was selling tobacco and pre-rolled papers and a machine to insert tobacco and a filter. There was much concern when they noticed that all of the college students seemed to be buying the papers and the rolling machine and not the tobacco or filters.
Fast forward about 25 or so years, and my daughter was learning to play the flute. The guy who re-padded her flute said she should carry a package of cigarette papers to put under the pads on the valves to absorb the moisture from playing. How would one explain that these days? Or buy the papers? They used to be in all the stores that had a tobacco section. And no, I would not loan her the insert from my Cheech & Chong "Big Bambu" album.
BTW, while in some ways it's kind of self-evident, I never mentioned why the inventors of core memory called them "core memories" instead of "ring memories."
It's because the idea evolved out of hysteresis effects in transformers, specifically in the iron or ferrite "cores" around which transformer coils were wound. It is in that context that the real reason for using the term "core" pops out, since the "core" is the lump of material at the very center of the windings.
But transformer cores don't work well if they are just blocky lumps, so very early in transformer history they started shaping them into rings. Below is a nice Wikipedia diagram of a simple ring-shaped transformer core. The figure makes it apparent that core memories were in a very real sense just huge arrays of tiny, hysteresis-rich, misbehaving-by-design transformer cores:
(Tangent, or if you prefer, sinx/cosx: There is a marvelous picture in the old Life-Time books of people at the peak of core memory miniaturization using magnifying viewers to thread near-microscopic cores. I don't know if the picture is online anywhere, but I may still have a copy of that book. I'll look.)
ETmax: One of the ferrite-core-based computers to which I made minor design contributions, the FCC at ICT, had delay lines as well as ferrite data and program stores; as a young lad I was seconded to its development team. (Aside: Max the Magnificent will remember that one, I'm sure.) Below the stacks of ferrite cores was a box with connections to the core matrices. As a newbie, nobody wanted to tell me what was in the box, as it was a secret. So one night when working alone I unscrewed the lid for a look-see. Very disappointed: all that was inside were lots of coils of coax. I later discovered that this was a series of delay lines of different lengths, driven from a single pulse, to provide the sequence of pulses to scan the x-y drive for the core matrix. I think one of the engineers had worked on the UK nuclear weapons program and adapted the constant-length delay lines used to trigger the RDX to provide a perfectly timed sequence of pulses. More detail will be found in my book "Computers that Didn't," to be followed by the sequel "Memories that Forgot," when I finally get round to writing them.
I think one of the things that could be a downside to the abundance of available memory is the incredibly bloated code that we now use. I just downloaded an update for my IM software. Seriously, 51MB for an update?!! (BTW, my first computer had no memory, just two disk drives, one for booting and one for storing.)
OK, I can beat you all. My first computer, a Sinclair ZX81 (sold under the Timex name in the States), had just 1K of RAM. I later got a Sinclair Spectrum with 16K. Programs were stored on audio tape cassettes (using FSK). You used a TV as the display. They used BASIC, and it's amazing what you could do with that small memory. You could also write and run Z80 machine code programs. They were lots of fun. Data (and programs) expand to fill the space available, they say; that's certainly true today, as Janine says.
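The FSK cassette storage mentioned above can be sketched in a few lines. This is a generic illustration in the spirit of the Kansas City standard (one tone for a "0" bit, a tone twice as high for a "1"); the frequencies and sample rate here are illustrative, and the actual Sinclair tape scheme differed in detail.

```python
import math

# Hedged sketch of FSK tape encoding: a '0' bit becomes one bit-period of
# a low tone, a '1' bit the same period of a tone twice as high, so every
# bit occupies the same amount of tape. Parameters are illustrative only.
SAMPLE_RATE = 9600
F0, F1 = 1200, 2400        # Hz for bit 0 and bit 1

def encode_bit(bit):
    freq = F1 if bit else F0
    n = SAMPLE_RATE // F0  # samples per bit period (8 here)
    return [math.sin(2 * math.pi * freq * t / SAMPLE_RATE) for t in range(n)]

def encode_byte(byte):
    samples = []
    for i in range(8):     # least significant bit first
        samples.extend(encode_bit((byte >> i) & 1))
    return samples
```

Decoding is the reverse: measure the dominant frequency in each bit period and map it back to a 0 or 1, which is why the scheme survived wobbly cassette motors so well.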
I was too young to afford a PC in the paper tape, punch card, or cassette days, but I can remember what a pain in the butt trying to use my grampa's trash-80 cassette.
Of course, real old school PC is the MITS Altair -- got to have all those lights and big switches!
But for crazy, it's hard to beat my brother, who designed and built an 8080 system, hand translated assembly code into binary, and programmed it to EPROM using DIP switches. Since he's a pack rat, he probably still has that system somewhere! For some reason, it didn't get much use...
@TonyTib But for crazy, it's hard to beat my brother, who designed and built an 8080 system, hand translated assembly code into binary, and programmed it to EPROM using DIP switches.
My first job as an intern was to wire up the EPROM programmer board for an 8080 system. After several weeks of toggling switches for the bootloader and hand translating assembly into binary to run tests, the first thing that got programmed into EPROM was the bootloader and EPROM programming code.
8080 has a very simple, pretty regular machine language, so hand-translating between ASM and octal bytes isn't hard. I had a Heathkit H-8, which included a marvelous single-page 8080 instruction table arranged in octal order. After a while, you can do most of the translations from memory. The PDP-8 and PDP-11 are even easier because of better regularity. In contrast, ARM instruction bits are all over the place in the later versions. Hand-assembling ARM would easily beat out your brother in the crazy department.
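The octal regularity described above is easy to show: the 8080's MOV instruction is literally the octal digits 01 DDD SSS, with documented register codes B=0, C=1, D=2, E=3, H=4, L=5, M=6 (memory via HL), A=7. A small sketch of hand-assembling from that table (the helper names are my own):

```python
# Documented 8080 register field codes.
REG = {"B": 0, "C": 1, "D": 2, "E": 3, "H": 4, "L": 5, "M": 6, "A": 7}

def mov(dst, src):
    """Assemble MOV dst,src: octal 01 DDD SSS, read straight off the table."""
    return 0o100 | (REG[dst] << 3) | REG[src]

def mvi(dst):
    """Opcode byte of MVI dst,d8: octal 00 DDD 110 (immediate byte follows)."""
    return 0o006 | (REG[dst] << 3)

print(oct(mov("A", "B")))  # prints "0o170" (hex 0x78)
print(oct(mvi("A")))       # prints "0o76" (hex 0x3E)
```

Once you remember the register codes, every MOV in a listing assembles in your head, which is exactly why those octal-ordered instruction tables worked so well.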
OK, I can beat you all. My first computer, a Sinclair ZX81 (used to be called Timex in the states) had just 1K of RAM.
I'll see your Sinclair and raise you by an Elf. I built my first computer based around the design for the RCA 1802 Cosmac Elf. It had 256 bytes of CMOS memory, and that consisted of 2 chips! When I developed our first product I used a 1K x 8 bipolar PROM that I turned on and off to conserve power, and after partitioning the software functionality paged in ~200 bytes at a time (needed scratchpad, stack, etc. in the rest).
You've got that right @betajet. And it applies to documents too. For example, there seem to be major penalties for formatting codes (or for the Apple-to-Microsoft conversion, I'm not sure which). I tried to save a .pages file as a .doc file and it increased in size by 20X. Bloat, bloat, bloat.
@Max, elsewhere I wrote at length on the many issues that come with bloated SW, such as load time, cache hits, execution time of crucial functions within one tick, and several others. Modern compiler writers just don't get it. I wrote a program in C and compiled it on MS C V5 for DOS back in 1990 and it was 25k. In 2008 I compiled it with the latest MS C++ compiler (still for a CLI) and it was almost 300k!! On the subject of memory etc., I have a databook for a 1-bit processor that can be programmed with a string of diodes and resistors :-)
There are similar (in inspiration) modern kits like this one; I plan on eventually getting one for my son, since it shows boolean logic in ways an Arduino can't.
My first programming wasn't on a PC, either: it was on my HP-34C programmable calculator. Then I got the renowned HP-41C, which I still have although I might eBay it after fixing the flaky 0 key (they fetch good prices on eBay).
I still like having a calculator (or two or three) around, despite the computers, and a couple years ago picked up a HP-48G+ because I wanted a RPN model that would show a four level stack. Yup, you can get calculator emulators for smart phones, but touching virtual buttons isn't the same.
I still have an ST225 20M hard drive kicking around somewhere, along with the even bigger 80M version. DiskCon (or what's left of it) typically has a historical exhibition of HDDs, including the original IBM drive, and some even bigger (physically) ones made by competitors.
My 1986 Intel databook has lots of great stuff like bubble memory, the iAPX432, and intelligent text display controllers. And although I haven't seen core memory, I know about it from my IBM 1620 manual (the 1620 worked in BCD).
Supposedly some military crypto systems used cone memory, but a quick search doesn't turn up anything on that. Of course, on the analog side, the Navy used to use some great technologies like 20 psi proportional air control systems and mag amps (magnetic amplifiers).
My favorite historical memory device is the Univac FASTRAND moving head magnetic drum unit, which stored data on the surface of a heavy drum approx 6 feet long by 1 foot in diameter. The Devil's DP Dictionary defined it as "a device for storing angular momentum". The gyroscopic effect of the drum was so strong they had to add a counter-rotating drum so that the 4,500-pound unit wouldn't turn as the Earth rotated.
No one seems to have mentioned magnetic card memory. It was popular on early HP programmable desktop calculators (circa 1969-1973 - I still have two kinds somewhere ...). I encountered a reference to another kind in my first job, at a bank. Their pre-IBM 370 series computer (I think it was an NCR) had a mass-storage device that used magnetic strips with a series of binary-coded notches in the top of each card. They were suspended on rods, and the rods were rotated so the correct card fell down into the reader (usually!). A programmer told me they often had mis-drops or jammed cards. Cheaper than a drum, I guess.
Back about 1972 or 1973, the newsletter of the Boston College data center was called "Hard Core." When they replaced their IBM 360 Model 40 with an IBM 370/145, there was a several-issue discussion on whether they needed to change the name of the newsletter since there was no longer any such thing as hard core. I never found out if they changed or not.
IBM once had a head-per-track hard drive that was used to emulate memory when semiconductor memory was too expensive and conventional hard drives were too slow for paging. Later, as the cost of semiconductor memory fell, a memory manufacturer made a plug-and-play replacement for the head-per-track disk using solid-state memory. Memory emulating a disk that emulated memory!
Stargazer: I think the Burroughs Corporation had a ledger card that had a magnetic strip along one side. Burroughs also invented virtual memory that I think used a drum, but it was not until another company promoted it that it really took off. I think the other company was IBM.
@Ron Neale - re Mag Stripe Ledger Cards - they did. My dad used to work for Burroughs (in sales, unfortunately, not in tech) but I remember those cards. I think the machines that used them were the L4000 series but it's a long time back, I was just a kid and I'm probably wrong.
Upon reflection I'm sure it was NCR because it was at a bank. They still used the NCR forms for planning and documentation by systems analysts and programmers. A quick look at Wikipedia found an article on the NCR CRAM - Card Random Access Memory: http://en.wikipedia.org/wiki/NCR_CRAM. The article documents the mis-drops that the older programmers told me about. There's a link at the end of the article to a PDF of the NCR Product Brochure, so you can see how large a monster this was. You can also see the notches on the cards and the shape of the rods, leading to the binary selection of a card.
I'm sure many of the BUNCH (Burroughs, UNIVAC, NCR (National Cash Register), CDC (Control Data Corp), and Honeywell) had similar products. The BUNCH succeeded IBM and the Seven Dwarves (NCR, Burroughs, Control Data Corporation, General Electric, Honeywell, RCA, and UNIVAC; SDS (Scientific Data Systems) was also active at that time).
In 1959-1960 the company that I worked for had a contract to manufacture the Arithmetic Unit of a vacuum tube based computer that was being built at the University of Oklahoma. It was a copy of a computer being built at Rice University. The memory used Barrier Grid CRT storage tubes. The computer had a capacity of 32768 words of 56 bit length.
"A Brief History of the Rice Computer 1959-1971" is a very interesting read about the hardware and architecture of the computer.