IBM ushered in what it called the "era of cognitive computing" yesterday at the Cognitive Systems Colloquium (CSC) held at IBM Research (Yorktown Heights, N.Y.).
At the event, IBM unveiled its newly minted Cognitive Systems Institute, a collaborative effort among universities, research institutes, and IBM clients to advance the state of the art in cognitive computing, starting with four major universities: Carnegie Mellon University (CMU), the Massachusetts Institute of Technology (MIT), New York University (NYU), and Rensselaer Polytechnic Institute (RPI).
"As part of this new era of cognitive computing, we wanted to make an announcement today about our Cognitive Systems Institute, which will involve four key universities," said Jim Spohrer, director of global university programs at IBM Research (Almaden, Calif.), in an exclusive interview with EE Times.
IBM's long history of building computers that can mimic the cognitive functions of humans began with its Deep Blue platform, which beat then-reigning world chess champion Garry Kasparov in 1997. More recently, IBM's Watson cluster supercomputer beat the human champions -- Brad Rutter and Ken Jennings -- on the television quiz show Jeopardy in 2011.
"After IBM's Watson won on the game show Jeopardy, many universities contacted us saying they would love to work with us on this new era of cognitive computing, and today we are glad to announce we are scaling up our activities in this area," said Spohrer.
IBM has already applied its Watson cognitive computer to applications in healthcare and financial services, where it combines deep database searches with intelligent pattern-matching algorithms to provide real-time advice to human experts. Now IBM aims to generalize its cognitive computing capabilities through collaborative efforts among academic, industry, and government research centers whose joint goal is to create cognitive computers that use natural language and brain-emulating algorithms to augment human intelligence in all areas of endeavor.
"Something big is happening in cognitive computing. It’s much bigger than Watson. We're here today because this is bigger than us. It's bigger than the IBM company," said John Kelly III, director of IBM Research at CSC. "The first eras of computing were about automating human tasks. This era is fundamentally different. This era will be about scaling and magnifying human capability. The separation between man and machine will blur. The synergy between the two will shine through."
IBM already works with thousands of universities worldwide, but its new Cognitive Systems Institute will enlist the help of particular universities to develop specific capabilities needed to realize a smarter, more user-friendly type of cognitive computer. The four universities initially joining the Institute will receive funding this year, to be followed next year by shared university research awards that include Power-architecture servers running a Watson open-source software stack.
The MIT team, led by professor Thomas Malone, will concentrate on developing what it calls socio-technical tools and applications that boost the performance of groups of workers engaged in collaborative tasks, such as decision making. By more closely connecting people and computers, the MIT effort will aim for combined man-machine performance that is more intelligent than any person, group, or computer can achieve alone.
"As the world becomes more interconnected through the use of communications technology, it may become useful to view all the people and computers as part of a single global brain," said Malone at CSC. "It's possible that the survival of our species will depend on combining human and machine intelligence to make choices that are not just smart but are also wise."
The RPI team, led by professor Selmer Bringsjord, will explore artificial intelligence techniques that take advantage of recent IBM advances in processing power, data availability, and "smart" algorithms including "semantic" data tools.
The CMU team, led by professor Eric Nyberg, will concentrate on the rapid construction, optimization, and real-time adaptation of large collections of analytic components, such as personalized information agents that directly interact with users.
The NYU team, led by Paul Horn, senior vice provost for research at New York University, will develop automated pattern recognition algorithms that reflect how deep learning using neural networks can impact science.
"Our view is that these new cognitive systems will accelerate progress immensely," said Spohrer. "Up until now we have been using cognitive shovels, but these new tools will be like cognitive bulldozers, enabling us to do a lot more in terms of decision-support systems that augment human performance. And from the global university perspective they will also have profound implications for the way we teach. Just as the calculator changed how students did math problems, cognitive computers will transform higher education."
In addition to expanding the capabilities of IBM's Watson, several other cognitive computing initiatives also fall under the umbrella of the Cognitive Systems Institute, including IBM's attempts to build computer chips modeled on the human brain -- the Systems of Neuromorphic Adaptive Plastic Scalable Electronics (SyNAPSE) project sponsored by the Defense Advanced Research Projects Agency (DARPA).
Yes, there is a new Cognitive Systems Institute website for sharing information; you can search for it.
About 300 faculty researchers worldwide (in artificial intelligence, cognitive science, neuroscience, and other relevant disciplines) with connections to IBM researchers and IBM Watson practitioners have been asking how to take the collaboration to the next level.
We are also exploring ideas in the Cognitive Systems Institute LinkedIn discussion group; again, you can search for it.
Contact me (Jim Spohrer, director of IBM Global University Programs and the Cognitive Systems Institute) if you have trouble finding it. I link to these from my service science website bio.
The "Watson" software (and hardware) capability of ingesting information and then providing meaningful answers to questions is an exciting development. The reduction in size of the hardware by orders of magnitude since the Jeopardy win suggests that it is also becoming commercially viable. I understand that there is already a project using "Watson" technology to help doctors select appropriate cancer therapies for patients. This is a welcome development, and I hope that the patients will benefit while the operational lessons learned expand the business opportunities for IBM. With luck (and innovative management), the Cognitive Systems Institute should help advance the frontiers of this artificial intelligence toward more practical uses.
Dr. Spohrer (Yale PhD in AI) used to be the IBM evangelist promoting the new field of 'Services Science', reflecting the economic structure of developed economies. Now IBM seems to be layering on BIG data and AI.
Having dabbled in Knowledge Engineering in grad school in the '80s, and having concentrated on search (such as IR was in those days, w/o the benefit of much more than the WELL to index), it seems to me that what you will find at the bottom of the vast open pit when all the data has been mined, before you reach Kubrick's enigmatic cuboid, is Tim Berners-Lee waving a Semantic Web manifesto. How could all that analysis NOT reveal regularities demanding standardized representations essential to automation? Folksonomies need not apply.
@ Junko.Yoshida "Educate us on the fundamental issues that Cognitive Systems need to solve."
Thanks for a great question, Junko. No timeline was mentioned, but here is my take on the most pressing problems that need to be solved. IBM already has a good grip on how to do deep searches into unstructured Big Data for specific domains; that's how it beat the human champions on Jeopardy, the TV quiz show created by Merv Griffin. IBM has also successfully repurposed those algorithms for medical diagnosis and financial planning (with other domains in the works) using what it calls its DeepQA architecture -- a 24-man-year effort to create a Practical Intelligent Question Answering Technology (PIQUANT), which in turn is based on the Open Advancement of Question Answering (OAQA) initiative, an open-source effort to make question-answering algorithms reusable across applications. However, the two areas that need the most work right now are the man-machine interface (on Jeopardy the questions were actually supplied to Watson in text form) and the database-selection problem. PIQUANT and OAQA work well in restricted domains, and that's the way it will probably stay for a while: queries restricted to specific problem areas. The ultimate goal, however, is to interpret the user's natural-language queries and then select a proper domain in which to make the deep Big Data dive for answers. Even more difficult will be carrying on a meaningful conversation with a user when the domain might shift from topic to topic. This kind of unrestricted trawling for meaning, which people do so naturally when, say, Googling this and that before the light bulb turns on in their head, is still a long way off, hence prompting IBM to form its Cognitive Systems Institute, which will attempt to augment man's meandering mind with the computational horsepower to scour Big Data for answers, even when the questions have not yet been clearly formed.
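Purely as an illustration of the database-selection step described above, here is a toy sketch of routing a query to a restricted domain before any deep search is run. The domain names, keyword lists, and overlap scoring are all invented for this example and have nothing to do with IBM's actual DeepQA/PIQUANT implementation:

```python
# Toy sketch (not IBM's code): route a natural-language query to the most
# relevant restricted domain before a deep search is attempted.
# Domain keyword lists are invented for this example.
DOMAIN_KEYWORDS = {
    "medical": {"symptom", "diagnosis", "therapy", "patient", "dosage"},
    "financial": {"portfolio", "interest", "bond", "dividend", "risk"},
}

def select_domain(query):
    """Pick the domain whose keyword set best overlaps the query terms."""
    terms = set(query.lower().split())
    scores = {d: len(kw & terms) for d, kw in DOMAIN_KEYWORDS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else None  # None: no confident match

print(select_domain("which therapy should this patient try"))  # medical
```

A real system would, of course, replace keyword overlap with trained classifiers and much deeper language analysis; the point is only that some explicit routing decision must precede the domain-restricted search.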
I agree, @C Davis... I think this opportunity is huge, and much larger than what Apple, Amazon, or even Google are contemplating... and it will permanently change our society... I was trying to be modest with a trillion-dollar estimate; as you say, it could easily be 20x that... Kris
It is easily a $20 trillion world market out of $70 trillion on the chopping block. The majority of the service industry can be automated (healthcare, education, law, engineering (why not writing software too?)). This is 75% of the US $15 trillion economy. The rest of the world is 2-3x of that.
My guess is this will be much more disruptive to the labor markets than any other technology revolution (agricultural, industrial, computer, ...). Given the speed with which it can be implemented, the breadth of the economy it impacts, and the total number of workers affected, we will have to rethink how we deploy and employ people productively in a very short time. Many winners and losers will be made in the process. Maybe I should buy some IBM stock now. Luddites will come out of the woodwork. They are already showing up at MIT ( http://www.technologyreview.com/featuredstory/515926/how-technology-is-destroying-jobs/ ).
@ krisi "I am not sure whether I like the vision of talking to computers"
I was not a fan of speech recognition either, until I started using Apple's Siri, which is quite good at dictation but still lacking in natural-language understanding (I have had to learn the exact phrasing needed to perform certain often-used tasks, like finding the local time when it's a specific time in another time zone). That's one dimension that cognitive computing will add: the ability to understand your queries without having to phrase them in particular ways. But the other side of cognitive computing is the ability to augment human capabilities, especially when trawling through Big Data. For instance, medical diagnostics is already profiting from applying IBM's Watson technology to intelligent searches through millions of unstructured records, from physician reports to journal articles, matching symptoms to diagnoses that even specialists might otherwise miss.
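As a toy illustration of that symptom-to-diagnosis matching, here is a minimal retrieval sketch. The records and the overlap scoring are invented for this example; real systems like Watson use vastly richer natural-language analysis and evidence scoring:

```python
# Toy sketch: rank unstructured medical "records" by how many of a
# patient's reported symptom terms each one mentions.
# Records and scoring are invented for illustration only.
RECORDS = [
    ("migraine", "throbbing headache with nausea and light sensitivity"),
    ("influenza", "fever chills cough fatigue and muscle aches"),
    ("anemia", "fatigue pale skin shortness of breath"),
]

def rank_diagnoses(symptoms):
    """Return candidate diagnoses ordered by symptom-term overlap."""
    terms = {s.lower() for s in symptoms}
    scored = []
    for diagnosis, text in RECORDS:
        overlap = len(terms & set(text.split()))
        scored.append((overlap, diagnosis))
    # Highest-overlap first; drop records with no matching terms at all.
    return [d for score, d in sorted(scored, reverse=True) if score > 0]

print(rank_diagnoses(["fatigue", "fever", "cough"]))  # ['influenza', 'anemia']
```

The value of the real technology lies in doing this at scale over millions of free-text documents, which is exactly where a clinician working alone runs out of time.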
You have to admire IBM for fantastic forward thinking... in a few years, cognitive expert systems will slowly start taking over the entire service industry (banks, doctors, customer service, etc.)... this is a trillion-dollar target... I am not sure whether I like the vision of talking to computers, but it looks like they will provide better service than human beings... Kris