I must be too easily impressed, because this is really impressive. A milestone even. From here, it's just a few short years before Watson's kids are answering their own questions recursively from being loaded up with all manner of highly specialized, obscure, & complex research material in medical, materials science and physics, and spitting out unified field theory and transparent aluminum formulas. WOW.
IBM has all sorts of plans to expand Watson's reach, using the lessons learned in founding the Open Advancement of Question Answering (OAQA) systems initiative. By following OAQA principles, it should be possible to house nearly any knowledge domain in a searchable format, enabling natural language queries for all sorts of applications. The only major limitation is the computing horsepower required (several seconds of supercomputer time per query), which will restrict users to those who can afford it.
R_Colin, earlier attempts to crack a closed information system using AI failed to consider the complexity of human communication (with cues and noise sources in the form of modulation, accent, etc.).
Were the logistics of the competition described to you? I did not get those details from this piece.
Is the question pre-entered into the system by a human, or captured via speech or vision recognition?
I wonder whether a system like this would be able to capture all the relevant details needed for an effective query.
For example, in the medical case, only if the device is able to capture the essence of all relevant data can I see the diagnosis being effective.
My thought is that the GIGO principle could limit the effectiveness of this effort in real commercial applications.