From what I understand, this is about building a 1,000-trillion-link neuron communication system, and less about understanding how the whole brain works.
But we are improving in our understanding of how the brain works. One example of this improvement is the technique of "deep learning", which is inspired by neuroscience. It uses a single algorithm that can learn diverse tasks (mainly in perception) just by being fed a lot of data. The same basic algorithm can learn speech recognition, image recognition, text understanding, and other tasks with really good accuracy, competing with the best algorithms designed by experts.
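To make the "one algorithm, many tasks" idea concrete, here is a minimal sketch in which the same small neural-network learner is fit to two unrelated datasets. The datasets and network size are toy stand-ins chosen for this illustration; real speech and image systems use far larger networks and far more data.

```python
# One learner, two unrelated tasks: the identical architecture and training
# rule are applied to handwritten-digit images and to wine chemistry data.
from sklearn.datasets import load_digits, load_wine
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

def train_and_score(X, y):
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    # Same network, same learning rule, regardless of the task.
    model = MLPClassifier(hidden_layer_sizes=(64,), max_iter=1000, random_state=0)
    model.fit(X_tr, y_tr)
    return model.score(X_te, y_te)

for name, data in [("digit images", load_digits()), ("wine samples", load_wine())]:
    print(name, train_and_score(data.data, data.target))
```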
I "recall" reading some medical literature that sleep provides the normal defragmenting and refresh process by which the brain reemphasizes desirable memories.
Such theoretical assessments align with technical processes experienced in computer programming (memory leaks, stack pointer overruns, etc.). Even if the human brain has a large memory capacity, what matters for a successful life is the ability to recollect at will, and not to be overburdened by a flow of regretful and sad memories.
Your suggestion does circumvent the unsupported notion that the brain processes information. However, your proposal that the networks be taught is very un-brain-like. Brains are not taught; rather, they teach themselves.
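In machine-learning terms, the closest analogue of "teaching itself" is unsupervised learning, where no teacher supplies labels. Here is a toy sketch of that distinction; k-means is only a stand-in for label-free learning, not a claim about how brains actually work.

```python
# Unsupervised learning: the algorithm is never told which point belongs
# to which group; it discovers the two clusters on its own.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Two unlabeled blobs of 2-D points.
data = np.vstack([rng.normal(0, 1, (100, 2)), rng.normal(5, 1, (100, 2))])
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(data)
print(labels[:5], labels[-5:])  # the model separated the groups itself
```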
Yes, we do need hard data; you're totally correct. So where is the hard data for the assertion that the brain is an information-processing system? I'm aware of none, but if you know of some, please post a link.
I'm sure they will. In the early '80s, a new technology called LCD screens came out. They were very low resolution and only grayscale (or amber-scale).
I suggested that progress would improve them to color and higher resolutions. The response at the time was "IN YOUR DREAMS".
Well, look at them now!
Thanks for your response.
This is a well-known idea, @wirenom... people have tried to use programmable hardware that is trained or evolved on its own by learning... it hasn't produced anything useful yet, and most research papers are actually doing this in software, which misses the point (as the software is ultimately executed on a digital computer)... My bet is that in 10 years you will see something interesting in this field... Kris
P.S. All of the above developments are based on CMOS analog circuits; nanotechnology is not yet useful for that functionality. It is still trying to deliver a single transistor, but it will get there eventually.
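For readers unfamiliar with the "evolved hardware" idea Kris mentions, here is a toy version run in simulation (which, as he notes, misses the point, but it shows the mechanism): a genome of configuration bits defines a tiny lookup-table "circuit", and a genetic algorithm evolves genomes until the circuit computes a target function (XOR here). Everything in this sketch is invented for illustration.

```python
# Evolving a 4-entry lookup table toward XOR with a tiny genetic algorithm.
import random

random.seed(0)
TARGET = [0, 1, 1, 0]  # desired outputs for inputs 00, 01, 10, 11 (XOR)

def fitness(genome):
    # The genome *is* the circuit's lookup table: one output bit per input.
    return sum(g == t for g, t in zip(genome, TARGET))

population = [[random.randint(0, 1) for _ in range(4)] for _ in range(20)]
for generation in range(50):
    population.sort(key=fitness, reverse=True)
    if fitness(population[0]) == len(TARGET):
        break  # a configuration that computes XOR has evolved
    # Keep the best half; refill with mutated copies of the survivors.
    survivors = population[:10]
    children = []
    for parent in survivors:
        child = parent[:]
        child[random.randrange(4)] ^= 1  # flip one configuration bit
        children.append(child)
    population = survivors + children

print("evolved config:", population[0], "in", generation, "generations")
```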
Although I am impressed, and overwhelmed, by the sheer size of this marvel, I can't help thinking of another approach.
Why expend such massive amounts of digital silicon to duplicate an analog function?
I read all the time about nanotechnology and how small op amps and digital gates are being made.
Why not manufacture op-amp-based neurons with about 100 conductive synapses each? Each could be programmed to perform in predefined ways.
Then put them together by the thousands, or millions, and start teaching them?
Wouldn't the end product (the knowledge gained) be far more valuable than how a million CPUs did something?
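For what it's worth, the op-amp neuron described above can be modeled on the back of an envelope as an inverting summing amplifier: the synapse conductances set the weights, and the supply rails give a saturating output. All component values below are made up for illustration.

```python
# A rough simulation of one op-amp neuron with 100 programmed synapses.
import numpy as np

def opamp_neuron(inputs, conductances, feedback_r=1e5, rail=5.0):
    # Summing-amplifier output: -Rf * sum(G_i * V_i), clipped at the rails.
    v_out = -feedback_r * np.dot(conductances, inputs)
    return np.clip(v_out, -rail, rail)

rng = np.random.default_rng(1)
g = rng.uniform(-1e-6, 1e-6, 100)   # signed weights via excit./inhib. paths
v_in = rng.uniform(-1.0, 1.0, 100)  # input voltages from upstream neurons
print(opamp_neuron(v_in, g))        # a bounded, analog "activation"
```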
I don't see much material in the news or blogs concerning neural chips or research, so I just thought I'd ask experts like you.
What are the engineering and design challenges in creating successful IoT devices? These devices are usually small, resource-constrained electronics designed to sense, collect, send, and/or interpret data. Some of the devices need to be smart enough to act upon data in real time, 24/7. Specifically, the guests will discuss sensors, security, and lessons from IoT deployments.
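As a flavor of what such a device does, here is a minimal sketch of the sense/collect/send loop, assuming a hypothetical MQTT broker at broker.example.com and a stand-in sensor function (both invented for this example).

```python
# A bare-bones IoT publishing loop: read a sensor, package the reading,
# send it to a broker, sleep to conserve power, repeat.
import json
import random
import time

import paho.mqtt.publish as publish  # pip install paho-mqtt

def read_sensor():
    # Placeholder for real hardware: a fake temperature in degrees Celsius.
    return round(20.0 + random.random() * 5, 2)

while True:
    payload = json.dumps({"temp_c": read_sensor(), "ts": time.time()})
    # Real deployments would add TLS, authentication, and retry logic.
    publish.single("sensors/room1/temperature", payload,
                   hostname="broker.example.com")
    time.sleep(60)  # constrained devices often sleep between sends
```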