All this to attempt to replicate the functions of a 3-lb (1.4 kg) brain that dissipates 50 watts--think about that.
And it still won't be able to do most of the things the brain does.
From what I understand, this is more about learning how to build a 1,000-trillion-link neuron communication system than about understanding how the whole brain works.
But we are improving our understanding of how the brain works. One example of this progress is the technique of "Deep Learning", which is inspired by neuroscience. It uses a single algorithm that can learn diverse tasks (mainly in perception) just by being fed a lot of data. The same basic algorithm can learn speech recognition, image recognition, text understanding, and other tasks with really good accuracy, competing with the best algorithms designed by experts.
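To make the "one algorithm, many tasks" point concrete, here is a minimal Python sketch (my own illustration, not the actual deep-learning systems referenced above): the same small network and training loop are reused unchanged on two different synthetic perception tasks; only the data changes.

    import numpy as np

    # One generic learner: the same two-layer network and training loop,
    # whatever the task is. Only the data changes.
    def train(X, y, hidden=32, epochs=500, lr=0.1, seed=0):
        rng = np.random.default_rng(seed)
        W1 = rng.normal(0, 0.5, (X.shape[1], hidden))
        W2 = rng.normal(0, 0.5, (hidden, y.shape[1]))
        for _ in range(epochs):
            h = np.tanh(X @ W1)                  # hidden layer
            err = h @ W2 - y                     # squared-error gradient
            W2 -= lr * h.T @ err / len(X)
            W1 -= lr * X.T @ ((err @ W2.T) * (1 - h ** 2)) / len(X)
        return W1, W2

    # Two different toy "perception" tasks, one learner.
    rng = np.random.default_rng(1)
    Xa = rng.normal(size=(500, 2))                          # task A: quadrant pattern
    ya = ((Xa[:, 0] * Xa[:, 1]) > 0).astype(float)[:, None]
    Xb = rng.normal(size=(500, 4))                          # task B: sum above threshold
    yb = (Xb.sum(axis=1) > 0).astype(float)[:, None]
    for X, y in [(Xa, ya), (Xb, yb)]:
        W1, W2 = train(X, y)
        acc = (((np.tanh(X @ W1) @ W2) > 0.5) == (y > 0.5)).mean()
        print(f"training accuracy: {acc:.2f}")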
"We don't know how the brain works as an information-processing system, and we do need to find out." Steve Furber could have reduced his ignorance by studying some of the research of Walter Freeman :
Dr Freeman's work does not support the view that the brain processes information.
I did not get the impression that "Dr Freeman's work does not support the view that the brain processes information" from browsing this website. It is merely trying to fathom "how" the brain processes information.
Well, there's a huge amount there - a quick browse is unlikely to do his work justice. OTOH, his book explains why the 'cognitivist model' (the information-processing view of the brain) does not hold water. So if you're interested, you could begin with that:
Simulating 1% of the brain with 1 million ARM cores is good. But what about memory? Does the human brain have much more memory? Also, the other part is learning, decision making, and the various moods of humans. That will be an interesting part of it.
This simulation will find out how the brain processes information. It will also be interesting to see how memory works, because human memory seems to have almost unlimited capacity and nearly instantaneous search speed (many times it will beat Google!).
Agreed @pixies...we forget things so we can remember new ones, this has been documented very well in neuroscience...but there are a few people in the world in whom the forgetting mechanism doesn't work. I watched interviews with them on PBS, amazing, they remember almost everything that happened in their lives, so brain memory capacity is fairly large (but their lives are actually pretty miserable, as they suffer from information overload)...Kris
I "recall" reading some medical literature that sleep provides the normal defragmenting and refresh process by which the brain reemphasizes desirable memories.
Such theoretical assessments align with technical processes familiar from computer programming (memory leaks, stack pointer overruns, etc.). Even if the human brain has a large memory capacity, what matters for a successful life is the ability to recollect at will, not an overburdening flow of regretful and sad memories.
I believe the brain depends on "impressions" to build memory (or determine what will be stored).
For instance, if you see a face regularly, you'll never forget it because it is "impressed" on your brain from time to time.
However, recognition is sometimes slow if you have not had many impressions or a lot of time has passed.
It is this reconstruction (slow recognition) by the brain that is really hard to explain... Was that piece of old information actually there? Or did the brain store only just enough information to enable reconstruction?
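One way "reconstruction from just enough information" could work is an associative memory, where a degraded cue settles back onto the nearest stored pattern. A toy Hopfield-style sketch (my own illustration, not a claim about how the brain actually does it):

    import numpy as np

    rng = np.random.default_rng(0)
    patterns = rng.choice([-1, 1], size=(3, 64))       # three stored "memories"

    # Hebbian storage: memories are kept as correlations between units,
    # not as raw copies of the data.
    W = sum(np.outer(p, p) for p in patterns) / 64.0
    np.fill_diagonal(W, 0)

    cue = patterns[0].astype(float)
    cue[:24] = rng.choice([-1, 1], size=24)            # corrupt a third of the cue

    state = cue
    for _ in range(10):                                # settle to a stored pattern
        state = np.sign(W @ state)
    print(f"recovered {(state == patterns[0]).mean():.0%} of the original")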
Think of human memory as an analogue IIR filter, rather than a digital FIR, and you get roughly the right idea.
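To make that analogy concrete, here is a toy sketch of the difference: an FIR "memory" forgets an event completely once it leaves its window, while an IIR "memory" keeps an ever-fading trace of it indefinitely.

    import numpy as np

    events = np.zeros(50)
    events[5] = 1.0                          # a single "experience" at t = 5

    # FIR: a fixed window; the event vanishes completely after N steps.
    N = 10
    fir = np.convolve(events, np.ones(N) / N)[:50]

    # IIR: exponential decay; the trace fades but never strictly disappears.
    alpha = 0.9
    iir = np.zeros(50)
    for t in range(1, 50):
        iir[t] = alpha * iir[t - 1] + (1 - alpha) * events[t]

    print("trace at t=40  FIR:", fir[40], " IIR:", round(iir[40], 6))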
Human memory is limited, but (for most people) rather large. It is also very efficient in storage - it remembers things in relation to other things, rather than "raw data". And your recall mechanism is mixed in with your imagination - if you can't remember details, your brain can make them up.
Also, we have to add power for other parts like memory, serial XCVRs, and other losses. The power supply design will be interesting. I remember the 30 kW power supply used by CERN. What will be the approximate size of the system?
Peter: If this hardware system is comparable to a bumblebee, how are we supposed to learn anything about human brain information processing? I would like to think we human beings are a little more complex than bees ;-)...an IBM project claimed cat-brain complexity, although even that claim was hotly disputed by another IBM group...Kris
Well, regardless of bumblebee or gnat, Prof. Furber's team claim that SpiNNaker can model 1% of the human brain. Maybe they hope to model different parts of human brain function, at different times, such as sight or hearing, or memory functions.
On behalf of those of you concerned about the power consumption of SpiNNaker, I fired off an email to Professor Furber and below is his reply.
"We expect the full million processor machine
to consume around 50kW to 100kW when all processors are running flat-out.
We don't expect them all to run flat-out very often, and the software is event-driven in a way that means that a processor with a 50% load will use 50% of peak power, so typical power consumption will be lower than above, but we have to provision for the peak.
Is it practical? The million processor machine will occupy several cabinets; at least 6 to 8, possibly more if the power density turns out to be an issue. Engineering details for the largest machines are still being worked on, but by adopting blade server or cluster computer-style distributed power supplies and cooling this doesn't seem out of line with established data-centre practice and power-densities.
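A quick back-of-the-envelope check of those figures (my own arithmetic, not from the SpiNNaker team): at the quoted 100 kW peak across a million processors, each core has a budget of about 0.1 W, and the event-driven design means total power should scale roughly linearly with load.

    cores = 1_000_000
    peak_w = 100_000                 # upper bound quoted above: 100 kW

    print(f"peak budget per core: {peak_w / cores:.2f} W")   # about 0.1 W

    # Event-driven software: power tracks load roughly linearly.
    for load in (1.0, 0.5, 0.1):
        print(f"load {load:4.0%} -> about {peak_w * load / 1000:.0f} kW")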
Thank you Peter and Dr Furber, that clarifies the power issues...high power but not impossible to deal with, of course...still a humbling comparison to the human brain's 30 W or so, and for the bee I guess it would be much less...good luck to the project, hope we learn something interesting when it is built and used, although I still doubt whether this really represents 1% of the human brain's complexity...Kris
One million cores simulate 1% of the human brain. That really means every one of us carries the equivalent of 100 million cores in our head. This is the most complex, most powerful embedded system in the whole world. I feel we are wasting this enormous power. The machine is truly within us. Why should we simulate it to understand its internal workings? Just think...
To @vasanth_d, true, the human brain is the most complex system in the world, at least for now (that might not be true in 30 years)...but how do you understand how it operates just by thinking? It seems to me that people have been thinking about the brain for a very long time, but only very recently, with progress in neuroscience and the ability to observe it (through fMRI), has some decent understanding started to slowly emerge...building an electronic equivalent, however small and simple in comparison to the real thing, might aid in that understanding...Kris
This is very important news. The brain is one of the most difficult things to study because the only tool we have to study it is similar to the Unit Under Test... our own brains. I think an important point in the article is that brain neurons communicate with analog signals, while to my knowledge all the processing power we have in the electronics world comes from digital signals and digital microprocessors. Would a neuron really be like a microprocessor? Will it just process bits of data? The links and information found in the comments are actually very interesting too... We'll wait for the outcome of the research, which will most probably last several years.
Luis, this is an ARM core, so it has to process digital bits...but there is some cool research at Stanford where the signal processing is done in the analog domain; their plenary talk is posted at www.cmoset.com (click on the 2011 conference)...Kris
I am not very familiar with the objectives of this project and its details. But from the above discussions, it appears we are trying to model an ASIC of today's complexity with transistor-level details to understand some of its functionality (many of you might strongly say that we might not get anywhere with this approach immediately!)
For the human brain, we may try this approach to get somewhere, as we may not know how to model it in a better way today.
I suspect the answer is that you cannot.
Even if you approached Professor Furber with a deal -- say the donation of a shiny new building for the University of Manchester to be called the Brown-Furber School of IT -- you might find that Professor Furber's hands are tied by the licensing terms he agreed with ARM.
But there is nothing to stop you taking a license and designing your own many-core ARM device. Professor Furber has shown that 18 cores plus loads of memory is possible in a 130-nm process. What could you achieve at 32/28-nm or 22/20-nm?
There is just one tiny little hindrance to designing my own ARM device - money!
There are lots of SoC devices available from different manufacturers, with all sorts of different cores. But there are not many that have a decent amount of memory in the same package. The idea of a single package containing a decent CPU (single or multi-core) and plenty of memory is very appealing - it would be smaller and easier to use than separate chips.
You are telling me that there is a market need; Professor Furber is telling me it is physically possible.
Therefore, market economics dictates that someone will go to the venture capital community (or to a corporate investor such as Samsung, Qualcomm or even ARM) and raise capital on the strength of the idea.
Indeed it is likely that someone already did and is being stealthy. We will try to find them for you.
There used to be the CRIS chips from Axis, but these seem to be discontinued.
It just strikes me that when you need a processor like an ARM (or MIPS, PPC, Coldfire, etc.) with more memory than you can get with a microcontroller, then you are going to need the CPU, DRAM of some sort, and Flash of some sort. If someone were to put all these modules inside one package, it would save a lot of effort and board space for many users.
If you find out about any suppliers that make such packages - and are happy to sell to small companies - I'm sure it would make an interesting article.
The SpiNNaker team is forgetting one important thing about the human brain. The Soul...
The core of the brain is the human Soul. Although the brain has limited memory capacity, the soul has unlimited capacity. Man will never be able to understand the complexities of the human brain and all its functions unless he humbles himself before the only ONE who truly knows the brain inside and out, because HE invented it. GOD.
That seems more like unprovable pseudo-science at this stage of mankind's knowledge. Where's the hard data backing up these claims of the soul's existence and unlimited "capacity"? It may well exist, but until it can be detected and measured and its properties are well understood, it can't be used in the context of being the driving force behind the brain's capabilities. One must use hard, empirical data to back up one's claims - not speculation bordering on fantasy - when proving or disproving a scientific theory.
Yes, we do need hard data, you're totally correct. So where is the hard data for the assertion that the brain is an information-processing system? I'm aware of none, but if you know of some, please post a link.
Although I am impressed, and overwhelmed, by the sheer size of this marvel, I can't help thinking of another approach.
Why expend such massive amounts of digital silicon to duplicate an analog function?
I read all the time about nanotechnology and how small op amps and digital gates are being made.
Why not manufacture opamp-based neurons with about 100 conductive synapses each? Each could be programmed to perform in predefined ways.
Then put them together by the thousands, or millions, and start teaching them?
Wouldn't the end product (the knowledge gained) be far more valuable than how a million CPUs did something?
I don't see much material in the news or blogs concerning neural chips or research, so I just thought I'd ask experts like you.
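For what it's worth, the behaviour proposed above (an analog weighted sum feeding a threshold, like an opamp summing node driving a comparator) is easy to sketch in software. A toy illustration only, not a hardware design: one leaky integrate-and-fire neuron with 100 weighted inputs.

    import numpy as np

    rng = np.random.default_rng(0)
    n_synapses = 100
    weights = rng.normal(0, 0.1, n_synapses)    # "programmed" synapse strengths

    v, leak, threshold = 0.0, 0.95, 1.0         # membrane state, decay, firing level
    for step in range(200):
        spikes = (rng.random(n_synapses) < 0.1).astype(float)  # random input spikes
        v = leak * v + weights @ spikes         # opamp-style weighted summation
        if v >= threshold:                      # comparator: fire and reset
            print(f"output spike at step {step}")
            v = 0.0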
Your suggestion does circumvent the unsupported notion that the brain processes information. However, your proposal that the networks be taught is very un-brain-like. Brains are not taught; rather, they teach themselves.
This is a well-known idea @wirenom...people have tried to use programmable hardware that trains or evolves on its own by learning...it hasn't produced anything useful yet, and most research papers actually do this in software, which misses the point (as software is ultimately executed on a digital computer)...my bet is that in 10 years you will see something interesting in this field...Kris
P.S. All the above developments are based on CMOS analog circuits; nanotechnology is not yet useful for that functionality, as it is still trying to deliver a single transistor, but it will get there eventually.
I'm sure they will. In the early '80s, a new technology called LCD screens came out. They were very low resolution and grayscale (or amber-scale) only.
I suggested that progress would improve them to color and higher resolutions. The response at the time was "IN YOUR DREAMS".
Well, look at it now!
Thanks for your response.