# Is D-Wave a Quantum Computer?

PORTLAND, Ore.—Recently I had to explain to a reader why critics say that D-Wave's so-called quantum computer is not a "real" quantum computer, an answer he accepted on my authority. But the question kept nagging at the back of my mind: why does D-Wave market what it calls a quantum computer if it is not for real? To get to the bottom of it, I asked Jeremy Hilton, vice president of processor development at D-Wave Systems, Inc. (Burnaby, British Columbia, Canada), why critics keep saying its quantum computer is not for real. He also revealed details about D-Wave's next-generation quantum computer.

"The Holy Grail of quantum computing to build a 'universal' quantum computer—one that can solve any computational problem—but at a vastly higher speed that today's computers," Hilton told EE Times. "That's the reason some people say we don't have a 'real' quantum computer—because D-Wave's is not a 'universal' computer."

D-Wave's quantum computer, rather, solves only optimization problems—those that can be expressed as a linear equation with many variables, each with its own weight (the number by which that variable is multiplied). Such equations are normally very difficult for a conventional 'universal' computer to solve, taking many iterations to find the optimal set of values for the variables. With D-Wave's application-specific quantum computer, however, such problems can be solved in a single cycle.
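To see why a conventional computer needs many iterations, consider a rough classical sketch. The weights below are made up for illustration (a real D-Wave problem is mapped onto the machine's qubit couplings, and the cost of checking every assignment grows as 2 to the number of variables):

```python
from itertools import product

# Hypothetical weights for a toy optimization problem.
weights = [3.0, -1.5, 2.0, -4.0, 0.5]

def energy(assignment, weights):
    """Weighted sum: each binary variable (0 or 1) times its weight."""
    return sum(w * x for w, x in zip(weights, assignment))

# A conventional computer must iterate over candidate assignments --
# 2**n of them in the worst case -- to find the optimal one.
best = min(product([0, 1], repeat=len(weights)),
           key=lambda a: energy(a, weights))
print(best, energy(best, weights))  # -> (0, 1, 0, 1, 0) -5.5
```

With 5 variables that is 32 candidates; with 512 variables it would be 2**512, which is why a single-cycle solution is the attraction.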

"We believe that starting with an application-specific quantum processor is the right way to go—as a stepping stone to the Holy Grail—a universal quantum computer," Hilton told us. "And that's what D-Wave does—we just to optimization problems using qubits."

D-Wave's current quantum processor has 512 qubits, allowing it to solve optimization problems with up to 512 variables in a single machine cycle. To solve such problems, D-Wave uses a different model of computation from a universal computer—the adiabatic model (adiabatic meaning occurring without loss or gain of heat)—instead of the approach taken by everyone working toward a universal quantum computer: the gate-based model, in which qubits are processed in a manner similar to conventional computers.

"The goal of the adiabatic method is to keep the qubits in their lowest energy state, which is where they are at the beginning and end of a optimization problem," Hilton told us. "When the weights of the variables are input the qubits go into an excited state, but quickly relax into their lowest energy state, thereby revealing the optimal values of the variables."

Those working toward a universal quantum computer today are obsessed with error correction methods—using up to thousands of qubits just to ensure that the superposition of values in a quantum state (part 0 and part 1) is maintained accurately throughout all of its calculations. With the adiabatic method, Hilton claimed, you don't need error correction because the qubits naturally relax into their lowest energy state.

"Our qubits go from excited level to a relaxed level, they don't need error correction at this point," Hilton told us. "But with gate-model of a universal quantum computer you need error correction to get anything to work at all."

**Companies are investing**

"What struck me when I talked to D-Wave is that they are rather modest," Mike Battista, senior manager and analyst of Infrastructure at Info-Tech Research Group. (London, Ontario, Canada) told EE Times. "They are excited about their technology, but don’t over-promise on its potential."

Battista also noted that D-Wave is pioneering more than quantum computing—it is also accumulating experience with new paradigms, like superconductivity, that could keep Moore's Law going.

"Their superconducting semiconductors have advantages even outside of being able to perform quantum computing, such as releasing no heat at all," Battista told us. There is also the potential for the technology to improve exponentially, perhaps being able to carry the next paradigm that continues Moore’s Law when traditional transistors reach their physical limits."

When asked why critics claim it's not a "real" quantum computer and say D-Wave should not be calling it one, Battista had a reasoned answer as to why it's going in the right direction.

"I know testing of the D-Wave hardware has been mixed, but I understand why large companies are investing in it anyway," Battista told us. "If there is even a small chance that this is the next foundational technology that underlies computing for the next few decades, the investments will be worth it. Companies that get a head start in developing algorithms and finding problems that are amenable to quantum computing will be at a huge advantage if/when viable hardware emerges."



John K Sellers 6/8/2015 4:32:08 PM

The real trouble is that quantitative scaling like that implies a qualitative change as well.

Let me make clear what I mean by example. If you have one horse, a saddle will work just fine. But if you have four horses, four saddles aren't going to help you pull a wagon. You need something else altogether.

So what are you going to do to handle our 512 entangled qubits? This may be too complicated to ever be able to manage. The properties of an entanglement are holistic and do not reside in each qubit; they exist only as a whole. You can't just look at a qubit and check off the state of each of its bits, or else it would be pretty straightforward to read out an answer. What you will probably (pun intended) have to do, since our only tool is statistics, is gather enough samples to precisely characterize what is going on. And those samples will not be in the context of a qubit or a particular state, but in the context of so many states that it is impossible to count them all, much less comprehensively examine them.

So let me ask you a question: how many statistical samples would you have to have to characterize over 10 to the 154th power superimposed states? And how easy would this be to do, even assuming we could somehow gather such samples when our entanglement collapses every time we look at it?
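The 10-to-the-154 figure follows from the 512 qubits mentioned in the article: 512 binary degrees of freedom span 2**512 basis states. A quick check of the arithmetic:

```python
# State-space size for 512 qubits: 2**512 basis states.
n_states = 2 ** 512
n_digits = len(str(n_states))
print(n_digits)  # -> 155, i.e. roughly 1.34 x 10**154
```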

Personally I don't think we will ever be able to do it.



traneus 5/18/2015 4:37:46 PM

Fifty and more years ago, classical analog computers were discussed more than they are now. We still build analog computers, though we seldom use the term: Every time we use an opamp, we are building an analog computer: The term "operational amplifier" came from the analog-computer world.

Large, fast quantum computers would be useful for certain classes of problems: Those where finding solutions requires exhaustive search of large data spaces, but where checking potential solutions for correctness is fast and easy. One example is Shor's algorithm for rapidly factoring large integers using a large quantum computer.
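The asymmetry the comment describes—slow exhaustive search, fast verification—can be sketched classically with factoring. The numbers below are small, made-up examples; trial division stands in for the "exhaustive search" side, and verification is a single multiplication:

```python
def trial_division(n):
    """Naive factoring: the slow, exhaustive-search side of the asymmetry."""
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f, n // f
        f += 1
    return n, 1  # n is prime

n = 100160063          # = 10007 * 10009, a product of two primes
p, q = trial_division(n)
assert p * q == n      # checking a candidate answer: one multiplication
print(p, q)            # -> 10007 10009
```

For the 600-digit-plus integers used in cryptography, the search side becomes infeasible classically, which is what makes Shor's algorithm on a large quantum computer interesting.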

Present-day quantum computers are useful to graduate students as topics for master's and doctoral theses.



Don Herres 5/15/2015 3:03:28 PM

"D-Wave's current quantum processor has 512 qubits, allowing it to solve optimization problems with less than or equal to 512 variables in single machine cycle." How many bits are in a variable? Does it use a number of processors equal to the number of bits?

From Colin's response on neural networks "Most neural network models use linear equations, where the variables are the constants are the synaptic values and the variables are the inputs from the problem being solved." Was this a typo with variables being used twice?



dt_hayden 5/15/2015 11:39:59 AM

After reading the paper, it strikes me A LOT as the same principle as "neural networks" but implemented in a parallel computing fashion rather than sequential. The issue I have with my understanding of neural networks is that the system only works if "trained" on all possible data sets. Perhaps because quantum computing can perform each analysis in parallel rather than sequentially, this is no big deal.

This is an interesting topic I am looking forward to understanding better.

* http://www.dwavesys.com/sites/default/files/Map%20Coloring%20WP2.pdf



dt_hayden 5/15/2015 10:31:59 AM

So what constitutes a machine cycle?

This is certainly a paradigm shift in thinking. All I can equate it to in my mind is an analog computing process along the lines of "artificial intelligence" or "expert systems" which were fads of the past. Not to say this is a fad.