MOSCOW -- Seth Lloyd, an MIT professor and self-proclaimed "quantum mechanic" who claims to have invented technology used in the world's first commercial quantum computer, from D-Wave Systems Inc., is crafting what he calls the first quantum application -- Quantum Machine Learning -- which he described here at the International Conference on Quantum Computing (ICQT).
"What can we use quantum computers for?" Lloyd said in his ICQT session. "I would like to propose the first quantum app, or q-app, which I call 'Quantum Machine Learning for Big Quantum Data.' "
In a nutshell, Lloyd's q-app (pronounced "quapp") encodes Google-like queries in q-bits, enabling quantum computers not only to perform real-time searches through even the most gigantic databases, but also to ensure the queries' absolute privacy, since any attempt by the search-engine provider to eavesdrop on a query would disturb the q-bits' delicate superposition of states.
Lloyd has tried to get commercial funding to develop his q-app, but has so far failed to convince any venture capitalist to back the project. The reason, he claims, is that his q-app ensures that the search engine would not be able to store users' queries and add them to the reams of information it already keeps on each of its users.
"The VCs say that the search-engine business model is to learn everything they can about their users, so my q-app goes against the very core of their business," said Lloyd.
To prove that it will work, and hopefully attract some brave VC to fund its development, Lloyd is revealing the details of how his q-app makes real-time searches through the biggest conceivable databases. And after losing control of the technology that he claims to have invented, which he says D-Wave is currently using without paying him royalties, this time he has patented his Quantum Machine Learning q-app.
D-Wave's commercial quantum computer has already been installed at both Lockheed and the Quantum Artificial Intelligence Lab created jointly by NASA, Google, and Universities Space Research Association.
The technique he proposes uses the same types of machine learning algorithms already developed for big data, but encodes their queries in q-bits that can be processed only on quantum computers. He claims the technique is so powerful that it could return results in real time from the biggest conceivable data sets. For instance, if the genome of every person on Earth were sequenced and encoded as a vector, then an array of those vectors could represent the entire combined genome of humanity -- about 10 to the 21st bits (1,000,000,000,000,000,000,000). Yet Lloyd claims even that gigantic database could be queried in real time using his q-app running on a future quantum computer with only a 70 q-bit processor.
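The arithmetic behind the 70 q-bit figure can be checked directly: n q-bits span a state space of 2-to-the-n amplitudes, so the number of q-bits needed to index N classical values is the base-2 logarithm of N, rounded up. A quick sketch (the 10-to-the-21st figure is Lloyd's back-of-the-envelope estimate from the talk):

```python
import math

# Lloyd's rough figure for the combined genome of every person
# on Earth: about 10**21 bits.
data_bits = 10**21

# n q-bits span a 2**n-dimensional state space, so indexing N
# values in superposition takes ceil(log2(N)) q-bits.
qubits_needed = math.ceil(math.log2(data_bits))
print(qubits_needed)  # -> 70
```

This is only a counting argument about addressing the data in superposition; it says nothing about how the data would be loaded or queried, which is where the actual algorithm does its work.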
Massachusetts Institute of Technology professor Seth Lloyd proposes a quantum machine learning algorithm that sifts through any-sized data set in real time.
Today, it is impossible to perform even the simplest linear manipulation on such large data sets, which would take more operations than all the computers in the world have ever performed since the very first computer was invented. But quantum computers are perfectly suited to processing this type of data -- big arrays of big vectors.
The key to Lloyd's q-app is that it would use quantum mechanics to map these big arrays of big vectors into a tensor product space, resulting in an exponential compression of the data set.
"We get a kind of quantum compression of the classical bits into much smaller q-bit space, then use conventional machine learning algorithms, but on the much smaller q-bit space," said Lloyd. "Many of the most popular machine learning techniques would work in this much smaller quantum space."
Lloyd has tested his algorithm, in theory and at small scale, using a conventional supervised machine learning algorithm operating on his q-bit space, which shrinks classical data sets exponentially. And he claims the process worked so well that even the largest data set of all -- every bit in the universe, which he describes in his book Programming the Universe -- would require a quantum computer with only 300 q-bits to query in real time.
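The 300 q-bit claim follows the same counting logic: 300 q-bits index 2-to-the-300 basis states, which is on the order of 10-to-the-90 -- roughly Lloyd's estimate in Programming the Universe of the number of bits the observable universe can register. Python's arbitrary-precision integers make this easy to verify:

```python
# 300 q-bits span 2**300 basis states; compare with the ~10**90
# bits Lloyd estimates the observable universe registers.
states = 2 ** 300
print(len(str(states)))  # -> 91, i.e. 2**300 is about 2 * 10**90
```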