FAYETTEVILLE, Ark. - Because error correction will generate more energy than chips can dissipate, quantum computers may be further off than previously believed, according to University of Arkansas physics professor Julio Gea-Banacloche.
"I don't have a theory yet," said Gea-Banacloche. "Rather, my work is based on the experimental results of others. But at a minimum my results indicate that for very large-scale quantum computations, one really needs to use systems with longer decoherence times."
Decoherence intervals measure how long a quantum bit, or qubit, can maintain the synchronized waveforms that represent both "1" and "0" simultaneously. Since quantum states are "fixed" only when they are observed, calculations made on them while they are nebulous are, in effect, performed in parallel on all their superimposed values. Unfortunately, most semiconductor designs on the drawing board have decoherence times of less than a microsecond, too short by a factor of 1,000, Gea-Banacloche said.
"If you have decoherence times of over a millisecond, then there is no problem at all, but if the system you are working on has an intrinsic decoherence time of less than a microsecond, then it is more or less hopeless that you will ever be able to scale up that system," he said.
On Gea-Banacloche's graph of decoherence times vs. energy consumed, quantum computers based on technologies with decoherence times of less than a microsecond (most of today's semiconductor designs) are off his scale. Even if researchers can lengthen the decoherence interval to 10 microseconds, their chips will still consume more than 100 megawatts.
Calculations are based on decoherence times vs. energy consumed by a theoretical quantum chip cracking a 1,024-bit encryption code. The bottom of the scale represents 1-second decoherence times, which consume only 10 milliwatts; at the top of the scale, with 10-microsecond decoherence times, the theoretical chip would need 100 megawatts to crack the code. The minimum requirement for quantum chip technologies, Gea-Banacloche said, is a 1-millisecond decoherence time, which by his calculations would consume about 1 watt.
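Using only the three figures quoted above (the exact curve is Gea-Banacloche's; this is just arithmetic on the reported points), a short script shows how steeply the power requirement climbs as decoherence times shrink. The effective power-law slope between adjacent points on a log-log plot grows sharply below a millisecond, consistent with error-correction overhead compounding:

```python
import math

# Data points quoted in the article: decoherence time (s) -> power (W)
points = {1.0: 10e-3, 1e-3: 1.0, 10e-6: 100e6}

# Effective power-law exponent between adjacent points:
# if P ~ tau^-k on a segment, then k = -d(log P)/d(log tau)
taus = sorted(points, reverse=True)
for t1, t2 in zip(taus, taus[1:]):
    k = -(math.log10(points[t2]) - math.log10(points[t1])) \
        / (math.log10(t2) - math.log10(t1))
    print(f"{t1:g} s -> {t2:g} s: P ~ tau^-{k:.2f}")
```

Between 1 second and 1 millisecond the slope is mild, but between 1 millisecond and 10 microseconds the quoted numbers imply power growing roughly as the fourth power of the shrinking decoherence time, which is why sub-microsecond systems land entirely off the chart.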
Unfortunately, the very thing that makes quantum systems useful, their ability to superimpose values, makes them even more prone to errors than classical systems. The nebulous state of qubits can be destroyed by a variety of factors, all of which boil down to an inadvertent coupling to the environment, resulting in decoherence of the superimposed values.
In all quantum-mechanical systems, the important principle has been to preserve "coherence," that is, to make sure that the qubits remain "unobserved" or otherwise undisturbed by other factors in the environment. Coherent states are fragile and easily destroyed by random events. Observation of the state of a qubit is one such source of decoherence; once that operation is performed, the system reverts to a normal digital system and the advantage of quantum computations is lost.
To solve the error problem, quantum error correction methods were proposed in 1995 and demonstrated in 1998. Since then, many groups have refined quantum error-correction encoding techniques, which replicate a nebulous qubit's value onto separate physical systems, enabling observers to "compare" qubits after a calculation without "observing" their nebulous values.
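The principle can be illustrated with the simplest such scheme, the three-qubit bit-flip code, far smaller than the concatenated codes a real machine would need. The classical simulation below (all function names are hypothetical, for illustration only) shows how parity checks locate an error on every branch of the superposition at once, without ever measuring the encoded amplitudes:

```python
# Classical simulation of the 3-qubit bit-flip code. A logical qubit
# alpha|0> + beta|1> is encoded as alpha|000> + beta|111>; parity checks
# ("syndromes") locate a single bit-flip error without revealing alpha or beta.

def encode(alpha, beta):
    # state = dict mapping 3-bit strings to amplitudes
    return {"000": alpha, "111": beta}

def bit_flip(state, q):
    # apply an X (bit-flip) error, or a correction, to qubit q
    return {s[:q] + str(1 - int(s[q])) + s[q + 1:]: a for s, a in state.items()}

def syndrome(state):
    # Measure the parities q0^q1 and q1^q2. Every branch of the
    # superposition yields the same parities, so this measurement pins
    # down the error location without collapsing the encoded amplitudes.
    parities = {(int(s[0]) ^ int(s[1]), int(s[1]) ^ int(s[2])) for s in state}
    assert len(parities) == 1  # identical on all branches
    return parities.pop()

def correct(state):
    # syndrome -> which qubit (if any) to flip back
    lookup = {(0, 0): None, (1, 0): 0, (1, 1): 1, (0, 1): 2}
    q = lookup[syndrome(state)]
    return bit_flip(state, q) if q is not None else state

# A bit-flip on any single qubit is undone, and the amplitudes survive intact:
noisy = bit_flip(encode(0.6, 0.8), 1)   # error on the middle qubit
print(correct(noisy))                   # {'000': 0.6, '111': 0.8}
```

This toy code only handles bit-flips on one qubit; protecting against arbitrary errors takes larger codes, and each level of protection multiplies the number of physical qubits and operations, which is where the energy cost Gea-Banacloche analyzes comes from.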
"In practice, error correction is what a quantum computer will be doing most of the time, because of the short decoherence times we have today," said Gea-Banacloche. "You don't even have to be calculating; if you just store qubits, they still spontaneously decohere through various mechanisms that couple them to the environment in various ways. Thermal excitation, spontaneous emission, random ambient magnetic fields or anything like that is going to destroy the coherence of a qubit."
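A rough sketch shows why storage alone forces constant correction. Assume a stored qubit's coherence decays as exp(-t/tau), a standard simple model; the 10^-4 per-cycle error budget below is an illustrative figure, not one of Gea-Banacloche's numbers:

```python
import math

def max_cycle_time(tau, p_target):
    """Longest wait between corrections if a stored qubit decoheres as
    exp(-t/tau) and each cycle must keep the error probability below p_target."""
    # error per cycle: 1 - exp(-t/tau) <= p_target  =>  t <= -tau * ln(1 - p_target)
    return -tau * math.log(1.0 - p_target)

for tau in (1e-6, 1e-3):  # microsecond vs. millisecond decoherence times
    t = max_cycle_time(tau, 1e-4)
    print(f"tau = {tau:g} s: correct at least every {t:.3g} s")
```

With a microsecond decoherence time, a full correction cycle has to fit into about a tenth of a nanosecond; a millisecond decoherence time relaxes that thousandfold, which is the sense in which error correction dominates the duty cycle of short-decoherence machines.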
The need for this quantum error correction led Gea-Banacloche to begin his calculations on how much is too much; in other words, when does error correction reach the point of diminishing returns? In his theoretical framework, that question resolved into the length of a system's decoherence time, because the shorter the decoherence time, the more inadvertent qubit errors need to be corrected.
"If you are lucky enough to have a long decoherence time, then you can do with much less energy, because you have to make corrections much less often," he said. "Once you pick a system and measure its decoherence time, then you will try to make it longer by any means necessary. For instance, in semiconductors that have been used for building quantum circuits, lower temperatures result in less thermal excitation, which turned out to be a preliminary condition for any calculations using quantum states."
To test his hypothesis, Gea-Banacloche examined the inner workings of all current quantum mechanical systems proposed for computation. After a detailed analysis of each system, he deduced that the amount of energy any such system requires to ensure that its results will be corrected for errors is inversely proportional to decoherence time. After quantifying this result for specific energy requirements vs. specific decoherence times, he concluded that hardly any of the currently proposed quantum computer chip designs can be scaled up, because the energy required to do error correction is off his scale.
"For scientific study, even a very simple quantum computer can be a very useful thing, but the real challenge is this: For the very large-scale integration you need to solve problems of 'strategic importance,' to use military language, such as factoring a 1,000-bit number [to crack encryption codes], you need a decoherence time of at least a millisecond," said Gea-Banacloche.
Gea-Banacloche said two approaches to quantum computation already show promise of increasing the time to decoherence to more than a millisecond. The first uses electron spin as its quantum state, as do most approaches to quantum chip making, but extends the spin's coherence time by confining it to superconducting loops called SQUIDs, for superconducting quantum interference devices.
"Josephson junction qubits may be an answer here," he said, "because superconducting currents are one of the very few microscopic quantum phenomena that can actually persist for a relatively long time. So theoretically, superconductivity may help, but the problem today is that in real experiments they have so far only been able to achieve decoherence times on the order of a microsecond."
So while some electron spin systems may one day hit a millisecond decoherence time, some atomic systems may beat them out by starting with longer-lived phenomena in the first place.
The only other variable, besides decoherence times, that affects Gea-Banacloche's calculations is the algorithm used for quantum error correction. He said there is a chance that improved error correction codes will relax the "1-millisecond minimum" by decreasing the number of steps required to perform quantum error corrections.
"The second area for improvements is for engineers and computer scientists to find better error-correcting codes," said Gea-Banacloche. "There are reports of new error correction methodologies that can lead to orders of magnitude fewer operations. These could also improve the prospects for quantum computers with intrinsically short decoherence times."