PORTLAND, Ore.—The last major engineering hurdle to quantum computers—millisecond coherence times—has been surmounted by researchers at IBM Research, making commercialization of the technology possible "within our lifetimes," according to Matthias Steffen, manager of IBM's Experimental Quantum Computing group.
Steffen and colleagues at T.J. Watson Research Center described their three breakthroughs Tuesday (Feb. 28) at the annual meeting of the American Physical Society (APS) in Boston.
"Given where we are now with coherence times, our engineers are now turning to the remaining engineering challenges that still need to be addressed before commercialization," said Steffen. "In particular, we need to be very careful about how we design the microwave interfacing to our quantum chips."
The three breakthroughs described by IBM include a nearly 0.1-millisecond (95-microsecond) coherence time for a q-bit isolated from its environment inside a 3-D copper waveguide cavity. The second demonstration was of a nearly identical q-bit, mounted instead on a 2-D planar substrate, which achieved a 10-microsecond coherence time. The third breakthrough was a demonstration of a 95-to-98 percent success rate for a two-q-bit logical operation called a controlled-NOT. The significance here is that a C-NOT gate, together with single-q-bit gates, can be configured to perform any quantum computation (much as the NAND gate can be configured to perform any classical computation).
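The universality claim can be illustrated numerically. The sketch below is plain NumPy, not IBM's hardware or software: it represents q-bits as state vectors and gates as matrices, and shows that a Hadamard (a single-q-bit gate) followed by a C-NOT produces an entangled Bell state, something no combination of single-q-bit gates alone can do:

```python
import numpy as np

# Q-bit basis states |0> and |1> as state vectors
zero = np.array([1, 0], dtype=complex)
one = np.array([0, 1], dtype=complex)

# Hadamard (a single-q-bit gate) and the two-q-bit controlled-NOT
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# C-NOT flips the target q-bit only when the control q-bit is |1>
assert np.allclose(CNOT @ np.kron(one, zero), np.kron(one, one))
assert np.allclose(CNOT @ np.kron(zero, zero), np.kron(zero, zero))

# Hadamard on the control, then C-NOT, yields the entangled Bell
# state (|00> + |11>)/sqrt(2), beyond the reach of one-q-bit gates alone
bell = CNOT @ np.kron(H @ zero, zero)
assert np.allclose(bell, np.array([1, 0, 0, 1]) / np.sqrt(2))
```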
The basic q-bit repository demonstrated by IBM was a super-cooled Josephson junction: two superconducting electrodes separated by an insulator. A super-cooled capacitor connected the two electrodes, lowering the device's operating frequency into a regime, up to about 20 GHz, that standard measurement equipment can handle today, though it still necessitated microwave-caliber test electronics.
The q-bit memories and gates were all constructed with micro-fabrication techniques already in common use for standard silicon chips, making IBM optimistic that it will be able to scale its system architecture up to thousands or even millions of q-bits per chip. As a result, calculations that were once considered impossible to perform can now at least be envisioned.
IBM's solid-state structure stores a single superconducting q-bit, the building block of future quantum computers.
Next, IBM is aiming to engineer working quantum computer components with inherent, on-the-fly error detection and correction. Since q-bits can represent whole arrays of binary values simultaneously, in what is called a superposition of states, it is crucial that premature decoherence not destroy these delicate states during a calculation. Error-correction schemes for quantum computers have been proposed, but all require coherence times approaching a millisecond, qualifying IBM's 0.1-millisecond demonstration as a breakthrough.
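One textbook example of the kind of scheme the article alludes to is the three-q-bit bit-flip repetition code. The NumPy sketch below is an illustrative simulation, not IBM's scheme: it encodes one logical q-bit into three, injects a bit-flip error, locates it from parity syndromes, and undoes it without ever reading out the protected amplitudes directly:

```python
import numpy as np

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)   # bit-flip (Pauli-X)
Z = np.diag([1, -1]).astype(complex)            # phase (Pauli-Z)

def op(gate, pos, n=3):
    """Apply a single-q-bit gate at position pos of an n-q-bit register."""
    out = np.array([[1]], dtype=complex)
    for i in range(n):
        out = np.kron(out, gate if i == pos else I2)
    return out

# Encode a|0> + b|1>  ->  a|000> + b|111>
a, b = 0.6, 0.8
encoded = np.zeros(8, dtype=complex)
encoded[0b000], encoded[0b111] = a, b

# A premature bit-flip error strikes the middle q-bit
corrupted = op(X, 1) @ encoded

# Parity syndromes Z0Z1 and Z1Z2 locate the error without
# measuring (and thus destroying) the superposition itself
s01 = np.vdot(corrupted, op(Z, 0) @ op(Z, 1) @ corrupted).real
s12 = np.vdot(corrupted, op(Z, 1) @ op(Z, 2) @ corrupted).real
where = {(-1, -1): 1, (-1, 1): 0, (1, -1): 2}[(int(np.sign(s01)), int(np.sign(s12)))]

# Flipping the identified q-bit back restores the encoded state
recovered = op(X, where) @ corrupted
assert np.allclose(recovered, encoded)
```

The catch, and the reason coherence time matters, is that the encode-detect-correct cycle must complete before decoherence scrambles the register.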
Quantum computers are desirable because they promise to outperform even supercomputers: to create un-crackable encryption codes, to perform optimizations across variable sets impossible to span today, and to search and sort vast databases with simultaneous parallel operations in a fraction of the time required by a conventional computer, which must inspect each database entry separately.
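The database-search speedup alluded to here is usually attributed to Grover's algorithm, which finds a marked entry among N with roughly the square root of N quantum queries, versus about N/2 for an average classical scan. A back-of-envelope comparison (textbook query counts, not a hardware benchmark):

```python
import math

# Grover's algorithm needs about (pi/4) * sqrt(N) oracle queries to find
# one marked item among N; an average classical scan needs about N / 2.
for N in (10**6, 10**9, 10**12):
    classical = N // 2
    grover = math.ceil(math.pi / 4 * math.sqrt(N))
    print(f"N = {N:>15,}: classical ~{classical:,} queries, Grover ~{grover:,}")
```

For a trillion-entry database the gap is roughly half a trillion classical queries against under a million quantum ones.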
This is great news. But I heard that quantum computers are so fast that they can easily crack security codes. Is it true? How can we make our future communications secure if quantum computers are commercialized?
Writing software for such a beast would be almost impossible. In fact, it would be counterproductive to have the software ride on the quantum computers -- it would drag their effective speed down to a halt!
Supercooling experimental devices is often done just to simplify the experiments, with the final production units optimized for running at room temperature. However, in this case the superconducting JJs are dependent on the supercooling, so they may end up like mag-lev trains -- requiring cryogenics even in normal use.
Hey, I'm old enough to be used to that (6502, Z-80, PDP-11, M68K, etc.) unlike programmers today who only have to contend with the latest x86 incarnations. It's amazing how software has actually been able to make progress when CPU architectures don't get radical makeovers every six months or so.
One thing I have not heard is whether or not anyone is working on the software for this beast. Speaking as someone who has written a few lines of code, I have no idea how to program for it. I am pretty sure that the OS is going to be more involved than just doing a Linux port, and I suspect that applications are also going to have to be written differently. Is anyone working on this, or are you hardware guys just going to pitch it over the fence when you are done? :-)
I read it as: they had to supercool the apparatus to slow it down enough to measure it with today's equipment. This leads one to question whether they can maintain the coherence time 'at speed,' when the apparatus is not supercooled.