Achieving the Compute Performance of the Human Brain

By EETimes  05.29.2014

Noted technologist, inventor, and futurist Ray Kurzweil once estimated the processing power of the human brain to be about 10 petaFLOPs — 10 times faster than the first petascale supercomputer that was activated in 2008. Today, many experts believe the processing capacity of the human brain is actually far greater, and some speculate that surpassing the capabilities of the human brain will require the vast processing power of “exascale computing.”

The first exascale supercomputers are expected between 2020 and 2022, driven by research programs funded by several governments around the world.

By definition, an exascale computing system delivers at least one exaFLOP of processing power: one quintillion (10^18) floating-point operations per second, a thousand-fold increase over a petascale computer. However, achieving exascale performance within the next six to eight years is by no means a certainty. The combined processing power of the world's top 500 supercomputers (223 petaFLOP/s as of the June 2013 Top500 list) still falls well short of a single exaFLOP.
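To make the scale concrete, here is a quick back-of-the-envelope check, a minimal sketch in Python; the only input is the 223 petaFLOP/s Top500 figure quoted above:

    # Unit sanity check for the figures quoted above (illustrative only).
    PETA = 1e15   # FLOP/s in one petaFLOP/s
    EXA = 1e18    # FLOP/s in one exaFLOP/s

    top500_june_2013 = 223 * PETA          # combined Top500 throughput, June 2013
    print(EXA / PETA)                      # 1000.0 -> exascale is 1000x petascale
    print(top500_june_2013 / EXA)          # 0.223  -> under a quarter of one exaFLOP/s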

Why do we need powerful supercomputers that run at these speeds? More than just matching the processing power of the human brain, powerful supercomputers enable us to better model and predict climate changes, improve medical modeling for personalized medicine, create new drugs in response to rapidly spreading viruses, boost efficiency in aerodynamics and industrial design, and achieve breakthroughs in nuclear physics for controlled fusion and new forms of clean energy.

To do this, however, the technology race to exascale computing will have to contend with another factor where the human brain reigns supreme: power efficiency. The human brain consumes a mere 20 watts of power in exchange for exascale processing potential. In contrast, the power requirements of operating an exascale supercomputer using today’s technologies could be so massive that it would require its own dedicated power-generation plant.
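As a rough illustration of that gap, the sketch below compares FLOP-per-watt figures in Python; the 1 exaFLOP estimate for the brain and the 500 MW draw for a hypothetical brute-force exascale machine are assumptions made purely for the arithmetic, not figures from the article:

    # Efficiency gap, order-of-magnitude only. Both the brain's "exascale
    # potential" and the 500 MW machine power draw are assumed values.
    EXA = 1e18
    brain_flops_per_watt = EXA / 20        # brain: ~1 exaFLOP at the quoted 20 W
    machine_flops_per_watt = EXA / 500e6   # hypothetical 500 MW exascale machine

    print(brain_flops_per_watt / machine_flops_per_watt)  # ~2.5e7x efficiency gap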

Is it possible that we will reach exascale computing by the year 2022? Yes. Is it possible that we will reach the power efficiency of the human brain by that time? Not a chance.

Some researchers promote a “brute force” processing approach to reaching exascale-level performance, harnessing as many standalone processors together as possible to achieve this momentous computing milestone. This approach may prove to be the fastest way to generate the needed level of processing capacity, but it is not sustainable, cost-effective, or reliable, given the complexity and error rates associated with such a massive machine.

Another approach is to keep pushing the envelope through breakthroughs in power-efficient processing — using existing technologies in creative new ways to produce much higher performance at far lower energy consumption. This is why the US Department of Energy (DOE), which funds research into power efficiency and other exascale technologies, has made energy efficiency a priority: the agency has encouraged researchers to meet strict power consumption targets for exascale-class computing architectures and designs.
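For a sense of what such a target implies, the planning figure most often cited around this time was roughly 20 MW for a one-exaFLOP machine. Taking that number as an assumption (it is not stated in the article), the required efficiency works out as follows:

    # Implied efficiency for a power-capped exascale machine. The ~20 MW cap is
    # an assumed planning figure, not taken from the article.
    EXA = 1e18
    power_cap_watts = 20e6

    required_gflops_per_watt = (EXA / power_cap_watts) / 1e9
    print(required_gflops_per_watt)        # 50.0 GFLOPS per watt to stay under the cap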

prabhakar_deosthali   2014-05-30 07:59:16

If a common human brain has that computing power, why spend trillions and trillions of dollars on developing exascale supercomputers?

Why not spend a fraction of that money on harnessing brain power for what scientists want to do?

Like modeling the world's weather patterns, predicting climate changes, and so on?

Here we have a few billion of these self-duplicating supercomputers, but we do not know how to use them!

What we need to develop is "surrogate computing": hiring someone's brain for a specific computing purpose. And maybe linking up such surrogate computers to create an ultra-fast computing network!

Just a wild idea!

_hm   2014-06-01 11:36:05

First, it will be interesting to know how human brain power is defined. Is there an accepted definition from the IEEE or another similar authority?

rfindley   2014-06-03 17:58:37

@_hm, I'm assuming you mean 'computational' power of the human brain, not 'energy' power.

Personally, I think the ever-increasing estimates of the computational ability of the brain are actually headed in the wrong direction.  We are currently in a period of "Moore's Law" of brain knowledge (doubling every year or two), but not so with actual 'understanding' of what we observe.  As a result, our computer-centric theories tend to guide us toward the (I think) false notion that we need more computing power to simulate every nook and cranny of the neuron.

If we apply a more stochastic perspective to brain computation, we can, in theory, actually decrease the estimated computational power of the brain, placing it squarely within reach of today's technology (given a proper understanding of how it works and, of course, a big enough budget).

To illustrate in terms more suitable to an EE: in some cases, a particular A/D converter design might benefit from using a low-res ADC and increasing its effective resolution by adding noise, oversampling, and averaging (see here). The brain does something similar: (a) the noise is added via the stochastic nature of neuron formation and connection; (b) oversampling occurs in space, rather than time, via an array of stochastically generated neurons clustered together; (c) averaging occurs in both time and space via signal summation (though that is a simplification for the sake of brevity).
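(For readers who want to see the dither/oversample/average trick in action, here is a minimal Python sketch of the idea; the signal level, bit depth, and sample count are invented for illustration and are not tied to any particular ADC.)

    # A 3-bit quantizer plus dither noise and averaging recovers sub-LSB accuracy.
    import numpy as np

    rng = np.random.default_rng(0)
    true_value = 0.3137                    # level to measure, normalized to [0, 1)
    levels = 8                             # 3-bit ADC

    def quantize(x):
        return np.round(x * (levels - 1)) / (levels - 1)

    single_shot = quantize(true_value)                  # one coarse conversion
    lsb = 1.0 / (levels - 1)
    dithered = quantize(true_value + rng.uniform(-0.5, 0.5, 4096) * lsb)
    averaged = dithered.mean()                          # oversample, then average

    print(abs(single_shot - true_value))   # error of a single coarse sample (~0.03)
    print(abs(averaged - true_value))      # far smaller after dither + averaging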

The point of the A/D example is that, when you understand why it's doing all those computations, you might decide that you have the resources to simply use a higher-resolution ADC, thus saving yourself a lot of computation.  Or maybe you can use a totally different sampling method that achieves the same goal. The same is true of the brain.  Its design is well-suited to grey matter, but maybe we can make some different choices better suited to silicon, while achieving the same result.

When we look at neurons without understanding how they work together, we assume that we need to simulate every little bias and noise and geometry of a neuron -- much like simulating every eddy current, leakage, and charge distribution in a transistor.  When you begin to understand the system on a larger scale, you realize such things are unnecessary, or at least can be minimized in the right context.

And most importantly: much of the brain is efficiently idling at any given instant. It is a sparse coding system consisting more of storage than computation (though this, too, is a simplification).  So, we can use our smarts to figure out how to take advantage of that knowledge, resulting in less necessary computation.

apchar   2014-06-03 19:11:19

Lest we forget, the human brain is an analog computer. It takes some serious digital horsepower to do what a summing op-amp/comparator can easily do with a few transistors. Emulating an analog system with digital hardware carries a tremendous computational multiplier. Ask any neural net guy.

krisi   2014-06-03 20:19:47

I am not sure I understand how an ADC plays a role in explaining brain power...the brain operates entirely in the analog domain, which is why it consumes 20 W and not the 5 MW that similar digital computation might require...there is no ADC in the brain! Kris

rfindley   2014-06-03 22:09:52

@krisi, The ADC example has nothing to do with power consumption. Nor is the point that the brain performs ADC operations (though it does, in a way). Rather, it simply illustrates that understanding a system allows you to optimize it for different circumstances.

Perhaps a better example: if I were to design a Playstation emulator on a PC, I might decide to build it as a virtual machine where each machine instruction on the Playstation processor is replaced with a similar instruction on the PC (this is a simplification, of course). But if I knew nothing about how a Playstation processor worked, I might be forced to simulate each individual transistor in that processor. Obviously, that would require massively more computation. This is essentially what is being done with the brain (and understandably so, given the general lack of understanding of the brain). But it doesn't have to be that way.

Researchers are realizing that neurons appear to do a lot more computational work (per neuron) than originally theorized. That is why some folks are increasing their estimates of how much processing would be required to implement a brain in silicon. But I think a significant part of that computation is specific to the grey-matter implementation. When trying to implement the equivalent functions (at the group-of-neurons level) in silicon, it can be implemented in a much more efficient way than simply copying how neurons work to the nth degree.

On a side note, the brain really doesn't operate entirely in analog. It is somewhat of a digital-analog hybrid, plus some aspects that aren't described well by either analog or digital. The brain converts almost all of its input to a quasi-digital code -- quasi because it uses several coding tricks such as pulse-frequency modulation. It has even been shown experimentally that information may be encoded as serialized digital symbols in certain parts of the brain and transmitted in a repeating loop in order to minimize the number of parallel connections required across brain regions!
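(As a toy illustration of the pulse-frequency-modulation idea mentioned above, the sketch below encodes an analog intensity as a spike rate and decodes it by counting spikes; all numbers are invented.)

    # Rate coding toy model: intensity in, spike count out.
    import numpy as np

    rng = np.random.default_rng(1)
    intensity = 0.6                        # normalized input, 0..1
    max_rate_hz = 100.0                    # assumed peak firing rate
    dt = 0.001                             # 1 ms time steps
    steps = 1000                           # 1 second observation window

    # Poisson-like spike train whose rate tracks the input intensity
    spikes = rng.random(steps) < intensity * max_rate_hz * dt

    decoded = spikes.sum() / (max_rate_hz * steps * dt)
    print(intensity, decoded)              # decoded value approximates the input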

krisi   2014-06-03 22:55:17

Thank you @rfindley for the quick and comprehensive answer...I am aware of the brain using the timing between pulses; to me this is analog computation...digital is basically looking at whether there is a one or a zero...I entirely agree with your main point: understanding what exactly a neuron does is the key...I am not sure how one goes about finding this out...and out of curiosity, where is the ADC in the brain? How many bits and what sampling rate? ;-)...Kris

rfindley   2014-06-04 04:23:03

@krisi,

> "out of curosity where is the ADC in the brain?"

In a way, everywhere.  The general principle of neural processing is competition, which naturally pushes away from gradients (i.e. analog) toward 'classification' of stimulus (ones and zeros).  [Interestingly, the quality of analog transmission in the brain is so low that it provably cannot be the sole means of transmitting sensory information].

The ear is a great example.  The first-layer neurons respond in an analog manner to the strength of the peaks of the standing waves captured in the cochlea.  But the successive neuron layers are conditioned to compete for which one is most sensitive to the pattern of peaks associated with a specific frequency.  The result is that, after a few layers of processing, a particular neuron can be ON or OFF based on the presence of a particular frequency.

Amplitude is processed in parallel with a different mechanism, but in a similar manner. Many neurons have a gaussian response function, such that they respond only within a range of volume.  Since neurons are generated stochastically, they each respond most strongly at a different volume level.  So, for any given volume level, a particular pattern of neurons will fire strongly.  Then, at the next layer, a single neuron can recognize the pattern for that volume level, and will turn ON or OFF accordingly.
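(A minimal sketch, in Python, of the population-coding picture described above: a bank of neurons with stochastically placed Gaussian tuning curves, read out by a winner-take-all stage at the next layer. The tuning centers, widths, and stimulus value are invented.)

    # Gaussian-tuned population plus winner-take-all readout.
    import numpy as np

    rng = np.random.default_rng(2)
    preferred = np.sort(rng.uniform(0.0, 1.0, 20))   # stochastic tuning centers
    width = 0.08                                     # assumed tuning width

    def responses(volume):
        return np.exp(-((volume - preferred) ** 2) / (2 * width ** 2))

    volume = 0.42
    winner = int(np.argmax(responses(volume)))       # next layer reads the winner
    print(volume, preferred[winner])                 # winner's preference ~ stimulus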

> "how many bits and what sampling rate?"

The technical answer is long, but it is possible to measure an effective bit resolution and sample rate for various senses.  However, it varies by genetics, usage, and even across the range of a single sensory organ.  For example, we have much higher bit resolution in the mid audio frequencies, because there are more neurons dedicated to that frequency range.  Also, blind people can develop higher auditory bit resolution, because processing of audio expands into the unused visual areas, allowing more 'oversampling' of audio signals in the deeper layers of audio processing.

Also, you can consciously direct an increase in effective bit resolution for some things. It's why we get better with practice.  Attentive processing causes neurons to configure more quickly to distinguish the attended stimuli, thus increasing effective bit resolution.  Nifty, eh?

> "understanding what exactly neuron does is the key"

Yes, exactly.  Or more specifically, understanding what each neural microcircuit does, where a microcircuit is usually made up of a dozen to a few hundred neurons.

> "I am not sure how one goes about finding this out"

That's the billion dollar question, isn't it :-).  I have my own ideas that I'm pursuing, but only time will tell who will be the first to the finish line :-).

krisi   2014-06-04 06:53:15

Fascinating story @rfindley...clearly I have a lot to catch up on in my understanding of how our brains work electrically...would you be interested in giving a talk on this topic at the emerging technologies conference in Vancouver in 2015? Details at www.cmosetr.com; please email me at kris.iniewski@gmail.com
