PORTLAND, Ore.—The stunning victory of IBM's Watson cluster computer over human champions Ken Jennings and Brad Rutter on the quiz show Jeopardy proved that artificial-intelligence algorithms can make a system a capable advisor to experts, though not an expert itself.
IBM is now adapting its Watson DeepQA architecture to commercial applications, where it will act as an advisor to human experts, defusing criticism that the system is not an expert itself (during the Jeopardy match, Watson asserted that Toronto was a U.S. city). By filtering Watson's advice through the expertise of a human, to catch such obvious mistakes, IBM hopes to apply Watson to a variety of fields, including healthcare, financial services, government-mandate management and retailing.
On Wednesday (Feb. 16), Watson wrapped up its victory in the three-day matchup, finishing with $77,147, well ahead of Jennings ($24,000) and Rutter ($21,600). For winning the IBM Jeopardy Challenge, Watson pulled down a cool $1 million, which IBM said it would donate to two charities: World Vision and the World Community Grid.
For medical applications, IBM plans for Watson-like systems to act as expert consultants for doctors. By comparing a patient's symptoms with those of millions of other patients worldwide, along with the treatments that cured them and the latest findings from medical journals, Watson will supply a ranked list of possible diagnoses. If the computer makes obvious mistakes on such a list, the doctor can delete them before communicating diagnostic advice to patients.
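The ranked-list idea above can be sketched in a few lines. This is a deliberately simplified illustration, not IBM's DeepQA algorithm: the conditions, symptom profiles and scoring rule are all invented for the example.

```python
# Hypothetical sketch: rank candidate diagnoses by how well a patient's
# symptoms match known symptom profiles. All data here is invented.
case_records = {
    "influenza": {"fever", "cough", "fatigue", "aches"},
    "common cold": {"cough", "sneezing", "sore throat"},
    "strep throat": {"fever", "sore throat", "swollen glands"},
}

def rank_diagnoses(symptoms):
    """Score each condition by the fraction of its profile observed,
    then return candidates best-first, as in Watson's ranked answers."""
    scores = {
        condition: len(symptoms & profile) / len(profile)
        for condition, profile in case_records.items()
    }
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

print(rank_diagnoses({"fever", "cough", "sore throat"}))
```

A real system would weigh millions of case records and journal findings rather than three toy profiles, but the shape of the output, candidates ordered by confidence for a doctor to review, is the same.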
On Thursday, IBM and Nuance Communications Inc. announced a research agreement to explore, develop and commercialize the Watson computing system's advanced analytics capabilities in the healthcare industry.
Watson (center) beat human champions Ken Jennings and Brad Rutter on the game show Jeopardy.
In the financial services area, Watson will provide real-time analytics to experts at financial institutions, allowing them to run what-if analyses on market data, current events, analyst opinions and a thousand other unstructured information sources that are difficult to combine with conventional algorithms. Here, as with the medical applications, human experts will interpret the recommendations, shielding users from the computer's mistakes.
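One simple way to think about combining such heterogeneous sources is a weighted average of per-source confidence scores. The sketch below is purely illustrative; the source names, weights and scores are assumptions, not anything disclosed about Watson's analytics.

```python
# Illustrative only: merge scores from disparate evidence sources into a
# single confidence value for a what-if scenario. Weights are invented.
def combine_evidence(source_scores, weights):
    """Weighted average of per-source scores, each assumed to lie in [0, 1]."""
    total_w = sum(weights[s] for s in source_scores)
    return sum(source_scores[s] * weights[s] for s in source_scores) / total_w

weights = {"market_data": 0.5, "news_events": 0.3, "analyst_opinion": 0.2}
scenario = {"market_data": 0.8, "news_events": 0.4, "analyst_opinion": 0.6}
confidence = combine_evidence(scenario, weights)  # 0.4 + 0.12 + 0.12 = 0.64
```

The hard part in practice is not the arithmetic but turning unstructured text (news, analyst notes) into numeric scores at all, which is where DeepQA-style natural-language analytics comes in.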
Likewise, where clarifying the dizzying array of laws and regulations takes human experts weeks or months to sort out, Watson will be able to cut that analysis time to hours or even minutes, with any mistakes corrected by the human experts supervising its work, according to IBM.
In retail settings, IBM will use Watson to consider past buying patterns, inventory, order management, supply-chain and other customer-relationship-management issues in targeting individual buyers. Here, mistaken recommendations will not count against Watson, since people are used to "irrational" buying patterns driven by issues such as brand loyalty, which transcends rational analysis, according to Aditya Vailaya, principal scientist at Retrevo Inc. (Sunnyvale, Calif.).
"I expected Watson to win," said Vailaya. "And watching how Watson won validated my feelings that these problems are solvable, but not without a few caveats."
Retrevo runs multiple parallel algorithms similar to the learning methods Watson used to answer Jeopardy clues, but Retrevo's domain is consumer products. Using a cluster of 50 Intel processors, Retrevo scans product specifications, reviews, user reports, blog entries and other Web-based information sources to answer users' questions about which products offer the best overall value. Watson will face the same challenges that Retrevo does when it is applied to retailing, according to Vailaya.
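The "multiple parallel algorithms" pattern can be sketched as several independent scorers run concurrently, with their votes averaged per product. This is a loose analogy, not Retrevo's or Watson's actual pipeline; the scorers, products and numbers are all made up.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical illustration: three independent scorers, each judging a
# product from a different evidence source, run in parallel and averaged.
def spec_score(product):   return {"camera_a": 0.9, "camera_b": 0.6}[product]
def review_score(product): return {"camera_a": 0.7, "camera_b": 0.8}[product]
def price_score(product):  return {"camera_a": 0.5, "camera_b": 0.9}[product]

SCORERS = [spec_score, review_score, price_score]

def best_value(products):
    """Average the parallel scorers' votes and return the top product."""
    with ThreadPoolExecutor() as pool:
        totals = {
            p: sum(pool.map(lambda f: f(p), SCORERS)) / len(SCORERS)
            for p in products
        }
    return max(totals, key=totals.get)

print(best_value(["camera_a", "camera_b"]))  # → camera_b (0.767 vs. 0.7)
```

At Retrevo's scale the scorers would be learned models over millions of documents rather than lookup tables, but the ensemble-then-aggregate structure is the common thread with DeepQA.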
"The challenge for Watson was that it took them four or more years to build the system to be as good as it is," said Vailaya. "Most of the algorithms Watson uses are available for broad general cases, but they have to be trained for specific applications."
The issue is not computational horsepower, according to Vailaya, but algorithm refinement, and that takes time to develop. Retrevo, for instance, constantly evaluates more than 50 million data sources regarding the 20,000 products it provides advice about on its website. But it has taken the company, founded in 2005, six years to "be as good as it is", a learning curve that Watson's spinoffs will have to repeat for each application to which IBM's DeepQA architecture is applied, according to Vailaya.
IBM spent tens if not hundreds of millions of dollars developing Watson, but it did so with much more foresight than with its earlier Deep Blue computer, which beat chess grandmaster Garry Kasparov. By building Watson from standard servers and a plug-in database, IBM can now plug in other domains and quickly get Watson-like applications running in commercial fields.