PORTLAND, Ore.—IBM will harness water-cooled 3-D chip sets to plumb the secrets of the universe by analyzing exabytes (billions of gigabytes) of data streaming in from the world's largest radio telescope, slated for construction in 2024.
The five-year, $42.5 million Dome exascale project will analyze data streams focused by an array of small dishes with a combined collecting area of one square kilometer (about 0.39 square miles), spread across a region 1,864 miles (3,000 kilometers) wide. As a result, the gigantic radio telescope will listen in on the faintest signals from the deepest parts of space, where the oldest events occurred—notably the Big Bang, from which the known universe originated. Along the way, it will also probe the secrets of the mysterious dark matter that makes up roughly 23 percent of the mass of the universe but is invisible to conventional telescopes.
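To get a feel for the exabyte scale involved, here is a back-of-envelope sketch. The link speed (100 Gbit/s) is an illustrative assumption, not a project figure; it simply shows why moving exabytes over a network is impractical and why the processing must happen close to the dishes.

```python
# Illustrative arithmetic: how long would it take to move one exabyte
# over a single 100 Gbit/s link? (Assumed link speed, not a Dome spec.)

EXABYTE_BYTES = 10**18           # 1 EB = a billion gigabytes (10^18 bytes)
LINK_GBPS = 100                  # assumed link speed in gigabits per second

bytes_per_second = LINK_GBPS * 10**9 / 8   # gigabits -> bytes per second
seconds = EXABYTE_BYTES / bytes_per_second
days = seconds / 86_400          # 86,400 seconds per day

print(f"About {days:.0f} days to transfer one exabyte")  # ~926 days
```

Even at 100 Gbit/s, a single exabyte takes roughly two and a half years to transfer, which is why the telescope's analytics must reduce the raw stream drastically before anything leaves the site.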
The gigantic Dome computing task, taken on by IBM's Center for Exascale Technology (Zurich) in cooperation with Astron—the Netherlands Institute for Radio Astronomy (Drenthe, the Netherlands)—will have a 12-year lead time to construct the water-cooled, three-dimensional (3-D) microchip technology necessary for a computing platform equivalent to millions of the world's fastest supercomputers running in parallel.
Besides 3-D computing cores, the Dome project will also develop optical data transport mechanisms based on nanophotonics and novel phase-change memory storage units. To keep the intense heat from the processors from melting down the computers, IBM will develop novel cooling technologies that pump water through the chips themselves.
"We will leverage 3-D chip sets and water cooling in order to have a very efficient way of processing the exabytes of data streaming in," said Martin Schmatz, a research scientist at IBM Zurich. "For storage we will also leverage novel components like phase-change memories."
To realize the dream, IBM and Astron have also partnered with universities in Australia and South Africa to develop smarter analytics that are capable of automated machine learning that filters out noise and constructs ultra-detailed sky maps.
Having an ambitious stretch objective enables the design of a new architecture that can address a massive data processing project. It will be interesting to see what spin-offs emerge. Once done, data mining all of accumulated human knowledge would seem a modest challenge in comparison.
Actually, I believe the current standard model is the Inflation Model, a variation of the Big Bang Model. There are experimentally testable predictions made by these models, such as the cosmic microwave background radiation. So people are following the scientific method. I do not know what other scientific methods you are referring to.
Anyway, what is wrong with spending money on investigating "Something happened a long time ago and we don't really know what it was"?
I am pretty certain @daleste that IBM is seeing potential to make big money on this technology eventually...otherwise they will simply not pursue it...water-cooled 3D technology is one of the few technologies that can deliver exabyte computing as the power dissipation is the most limiting factor...Kris
"...notably the Big Bang from which the known universe originated."
Question: Why is it that theories have become commonly accepted as something more than theories? Whatever happened to the scientific method? Or is "Big Bang" a euphemism for 'Something happened a long time ago and we don't really know what it was. But we need to keep the research dollars flowing, so we're going to call it by this cool alliterative name and hope that nobody notices.'?
Other than that, go Big Blue!
3D chips are the future of computing. This gives a good overview: http://www.jilp.org/vol9/v9paper9.pdf Within a few years, we will have the computing power of today's desktop in a package the size of an Arduino, with what was spread out in 2 dimensions stacked up in 3.
I believe that the water-cooled 3D chips IBM develops to process exabytes of data daily will have myriad commercial applications as the Internet of Things begins streaming sensor data from every corner of the Earth up to cloud computers. The more analytics that can be performed by network edge devices with these 3D chips, the less congestion there will be on the Internet.