Noted technologist, inventor, and futurist Ray Kurzweil once estimated the processing power of the human brain to be about 10 petaFLOPs — 10 times faster than the first petascale supercomputer that was activated in 2008. Today, many experts believe the processing capacity of the human brain is actually far greater, and some speculate that surpassing the capabilities of the human brain will require the vast processing power of “exascale computing.”
The first exascale supercomputers are expected between 2020 and 2022, driven by research programs funded by several governments around the world.

By definition, exascale computing systems deliver at least one exaFLOP of processing power: one quintillion (10^18) operations per second, a thousand-fold increase over a petascale computer. However, achieving exascale performance within the next six to eight years is by no means a certainty; the combined processing power of the world's top 500 supercomputers (223 petaFLOP/s as of the June 2013 Top500 list) still falls short of a single exaFLOP.
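As a quick sanity check on these figures, the scale arithmetic can be sketched in a few lines of Python (an illustration only; the variable names are not from any real benchmark API):

```python
# Back-of-the-envelope check of the scale figures quoted above.

PETAFLOPS = 10**15  # one quadrillion operations per second
EXAFLOPS = 10**18   # one quintillion operations per second

# One exaFLOP/s is a thousand-fold increase over one petaFLOP/s.
assert EXAFLOPS // PETAFLOPS == 1000

# Combined Top500 performance cited for June 2013: 223 petaFLOP/s,
# still well under a quarter of a single exaFLOP/s.
top500_total = 223 * PETAFLOPS
print(top500_total / EXAFLOPS)  # 0.223
```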
Why do we need powerful supercomputers that run at these speeds? More than just matching the processing power of the human brain, powerful supercomputers enable us to better model and predict climate changes, improve medical modeling for personalized medicine, create new drugs in response to rapidly spreading viruses, boost efficiency in aerodynamics and industrial design, and achieve breakthroughs in nuclear physics for controlled fusion and new forms of clean energy.
To do this, however, the technology race to exascale computing will have to contend with another factor where the human brain reigns supreme: power efficiency. The human brain consumes a mere 20 watts of power in exchange for exascale processing potential. In contrast, the power requirements of operating an exascale supercomputer using today’s technologies could be so massive that it would require its own dedicated power-generation plant.
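To put that efficiency gap in numbers, here is a rough illustration using only the figures above, with the brain's processing potential taken (as an assumption) to be on the order of one exaFLOP:

```python
# Rough power-efficiency figure based on the numbers in the text.
# Assumes the brain's processing potential is roughly one exaFLOP/s.

BRAIN_WATTS = 20      # approximate power draw of the human brain
BRAIN_FLOPS = 10**18  # assumed exascale-level processing potential

flops_per_watt = BRAIN_FLOPS / BRAIN_WATTS
print(f"{flops_per_watt:.0e} operations per second per watt")  # 5e+16
```

By this estimate, the brain achieves around 5 x 10^16 operations per second per watt, a target no silicon architecture comes close to today.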
Is it possible that we will reach exascale computing by the year 2022? Yes. Is it possible that we will reach the power efficiency of the human brain by that time? Not a chance.
Some researchers promote a “brute force” processing approach to reaching exascale-level performance, harnessing as many standalone processors together as possible to achieve this momentous computing milestone. This approach may prove to be the fastest way to generate the needed level of processing capacity, but it is not sustainable, cost-effective, or reliable, given the complexity and error rates associated with such a massive machine.
Another approach is to keep pushing the envelope through breakthroughs in power-efficient processing, using existing technologies in creative new ways to produce much higher performance at far lower energy consumption. This explains why the US Department of Energy (DOE), which funds much of the research into power efficiency and other exascale technologies, has made energy efficiency a priority, encouraging researchers to meet strict power-consumption targets for exascale-class computing architectures and designs.




If an ordinary human brain has that computing power, why spend trillions and trillions of dollars on developing exascale supercomputers?

Why not spend a fraction of that money on harnessing brain power for what scientists want to do? Like modeling the world's weather patterns, predicting climate change, and so on?

Here we have a few billion of these self-duplicating supercomputers, but we do not know how to use them!

What we need to develop is "surrogate computing": hiring one's brain for a specific computing purpose. And maybe we could link up such surrogate computers to create an ultra-fast computing network!

Just a wild idea!