Comments
Page 1 / 2   >   >>
R_Colin_Johnson
User Rank
Blogger
Re: Familiar Architecture
R_Colin_Johnson   8/11/2013 2:31:10 PM
NO RATINGS
Yes, I think IBM's motivation was to implement a conventional neural network architecture, but one that minimized the necessary hardware, thus the analog summing nodes. IBM says its neuron can be implemented with just 1272 gates--pretty economical.
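For a feel of what such a summing node does, here is a minimal Python sketch of a leaky integrate-and-fire style neuron -- an illustrative model only, not IBM's actual gate-level design:

# Hedged sketch: a leaky integrate-and-fire style summing neuron, roughly the
# kind of behavior an "analog summing node" provides. Illustrative model only,
# not IBM's actual 1272-gate design.

def lif_neuron(spike_trains, weights, threshold=100, leak=1):
    """Integrate weighted input spikes each tick; fire when the membrane
    potential crosses the threshold, then reset."""
    potential = 0
    output_spikes = []
    for tick_inputs in spike_trains:          # one list of 0/1 spikes per tick
        potential += sum(w * s for w, s in zip(weights, tick_inputs))
        potential -= leak                     # constant leak per tick
        if potential >= threshold:
            output_spikes.append(1)
            potential = 0                     # reset after firing
        else:
            output_spikes.append(0)
    return output_spikes

# Example: three synapses, ten ticks of identical input spikes
spikes = [[1, 0, 1]] * 10
print(lif_neuron(spikes, weights=[30, 10, 25]))   # fires on roughly every other tick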

DrFPGA
User Rank
Blogger
Familiar Architecture
DrFPGA   8/11/2013 2:21:01 PM
NO RATINGS
Interesting how similar this architecture is to the old familiar CPLD architectures. Inputs are connected to AND terms and then 'summed' via OR terms. Outputs are routed to a connection matrix. Really the only change is the switch from digital logic to an 'analog' summing neuron. Makes me think a digital version would be much easier to experiment with using an FPGA/CPLD architecture. Perhaps that was phase 0 of the project...
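For anyone who wants to play with the comparison, here is a toy Python sketch of both ideas -- purely illustrative models, not actual CPLD or IBM netlists:

# Hedged sketch contrasting a CPLD-style product-term/sum-term block with a
# weighted "analog" summing neuron. Toy models for illustration only.

def cpld_macrocell(inputs, and_terms):
    """Classic sum-of-products: OR together a set of AND terms."""
    products = [all(inputs[i] for i in term) for term in and_terms]
    return int(any(products))

def summing_neuron(inputs, weights, threshold):
    """Replace the OR of product terms with a weighted sum and a threshold."""
    return int(sum(w * x for w, x in zip(weights, inputs)) >= threshold)

inputs = [1, 0, 1, 1]
print(cpld_macrocell(inputs, and_terms=[(0, 2), (1, 3)]))       # (A AND C) OR (B AND D) -> 1
print(summing_neuron(inputs, weights=[0.5, 0.2, 0.4, 0.1], threshold=0.8))   # 0.5+0.4+0.1 = 1.0 -> 1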

Kinnar
User Rank
CEO
Re: Neural Nets in Hardware
Kinnar   8/11/2013 10:39:53 AM
NO RATINGS
Corelets really involves two languages: one that programs the hardware and one that programs the software. I think this thread will merge with multiprocessing and parallel-processing techniques; since Corelets deals with many decision-making elements, we could call them tiny processors.

R_Colin_Johnson
User Rank
Blogger
Re: How hard is it to program?
R_Colin_Johnson   8/9/2013 1:20:45 PM
NO RATINGS
Yes, neural networks dropped off our radar in the trade press, because all the startup companies either failed or were absorbed by larger corporations that only used their technology for special purposes. However, the International Joint Conference on Neural Networks has continued to make slow, steady progress--especially in the learning methods you mention, which have become quite sophisticated. And now that IBM is backing them, we should finally see the dream start to materialize. By the way, HRL's Center for Neural and Emergent Systems (CNES) is also in DARPA's SyNAPSE program. HRL is using memristors as its artificial synapses for learning:

http://www.eetimes.com/document.asp?doc_id=1264640

R_Colin_Johnson
User Rank
Blogger
Re: 25 years later ...
R_Colin_Johnson   8/9/2013 1:10:23 PM
NO RATINGS
Matt, thanks for your insightful roundup of neural network history and the remarkable opportunity they offer to answer some of the world's most profound questions about the brain and human consciousness. Of course, these questions will not be answered anytime soon, but at least there is now light at the end of the tunnel :)

LarryM99
User Rank
CEO
Re: How hard is it to program?
LarryM99   8/9/2013 1:08:49 PM
NO RATINGS
I'm not sure that 'program' is the right word to use here. This type of neural network typically needs to be taught rather than programmed. In past efforts of this type, the emphasis was on methods for adjusting how specific simulated neurons react to training input, and on whether you run a separate learning phase or keep learning switched on while the net is in live operation.
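As a rough illustration of "taught rather than programmed", here is a minimal perceptron-style weight update in Python (a textbook delta rule, not IBM's learning method); the same loop can run as a separate training phase or stay switched on during live operation:

# Minimal perceptron-style learning sketch (textbook delta rule, not IBM's method).
# The net is taught from examples; nothing about AND is ever programmed in directly.

def train(samples, lr=0.1, epochs=20):
    weights, bias = [0.0, 0.0], 0.0
    for _ in range(epochs):                    # separate learning phase
        for x, target in samples:
            out = int(weights[0] * x[0] + weights[1] * x[1] + bias > 0)
            err = target - out                 # teaching signal
            weights = [w + lr * err * xi for w, xi in zip(weights, x)]
            bias += lr * err
    return weights, bias

# Teach the neuron the AND function from examples
and_samples = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w, b = train(and_samples)
print([(x, int(w[0] * x[0] + w[1] * x[1] + b > 0)) for x, _ in and_samples])
# -> the taught net now reproduces AND on all four inputs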

The last time that neural network research was in vogue there were some significant advances, but then it dropped out of the press. I remember the observation at the time that 'artificial intelligence' is what it's called only while it is unproven. Once it actually works, it just becomes an engineering design method.

MattScottEDA
User Rank
Rookie
25 years later ...
MattScottEDA   8/9/2013 1:03:10 PM
NO RATINGS
Awesome! We started dreaming of analog neural architectures back in the late 80's ... but were severely limited by the processes, EDA tools, and lack of programming. The first book I have is Analog VLSI Implementation of Neural Systems by Carver Mead and Mohammed Ismail. That ... and a couple of professors at Indiana University (Prof. Jonathan Mills -- Lukasiewicz logic arrays, and Gregory Rawlins -- genetic algorithms) inspired my first design and implementation of a simple modular architecture similar in concept to IBM's ... back in 1989. That inspired an IEEE best-selling book based on our varied implementations of controllers for a memory-wire-controlled 'stiquito' hexapod.

I had hypothesized that there must be a continuous path from the single-cell neuron controller of a cilium, expanding fractally to multi-segment articulated creatures ... via a natural evolution of neural loops. There was no other way -- since the primitive creatures had no controller algorithm. What was amazing was that the pseudo-evolved neural loops resulted in a set of gaits which exactly matched those seen in nature! Expanding on that evolutionary path led to an overall architecture of the mind, based on said loops. However, at a point of complexity, the loops are not hard-coded in genes, but rather acquired ... with a little boosting from the core loops.

The idea of consciousness and qualia existing as a set of active loops, competing in the non-physical domain for the 'attention loop', was further supported by Alwyn Scott's soliton idea. There was a heated debate between him and the Stuart Hameroff team -- who believe that consciousness resides in quantum coherent states maintained by millions of microtubules in each neuron. Now ... we shall have the potential to prove Alwyn Scott right (although we cannot prove him or Stuart wrong). To prove Hameroff/Penrose right, we would need to add nano-structures capable of holding quantum states ... which is also possible now (i.e., FinFET quantum tunneling). So -- who will win? Zombie deterministic machines? Or ethereal universe-harmonizing super-Gödel pan-dimensional Schrödinger quantum spirit-minds? Unfortunately -- both are capable of sentience. But the first one will kill you ... eventually. That is -- it will lack the essence of free will, pure creativity, and any connection to God. imho.
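For readers curious about the neural-loop idea, here is a toy Python model of activity circulating around a ring of units, each driving one leg -- a deliberately simplified illustration of a loop producing a wave gait, not the original stiquito controller:

# Hedged toy model of a "neural loop" gait generator: a packet of activity
# circulating around a ring of units, each unit lifting one hexapod leg.
# Illustrative only; not the original stiquito controller described above.

def ring_gait(steps=12, legs=6):
    state = [1] + [0] * (legs - 1)         # one packet of activity in the loop
    for t in range(steps):
        print("tick %2d  legs: %s" % (t, "".join("^" if s else "-" for s in state)))
        # each unit simply passes its activity to the next unit in the ring,
        # producing a metachronal wave traveling around the body
        state = [state[(i - 1) % legs] for i in range(legs)]

ring_gait()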

Matt - Sr. Intel Engineer ... worrying about nano-level transistor effects.

http://www.matthew-scott.com/prj/ch13/index.htm

R_Colin_Johnson
User Rank
Blogger
Re: Neural Nets in Hardware
R_Colin_Johnson   8/9/2013 9:50:29 AM
NO RATINGS
Yes, neural networks have been knocking around for a couple decades. In fact, back then I wrote a book for John Wiley and Sons--"Cognizers: Neural Networks and Machines that Think" (which I revised recently). IBM is inching closer to the dream of cognitive computers--what I called cognizers--but it will still be many years before they mature enough to go mainstream.

ologic
User Rank
Rookie
Re: How hard is it to program?
ologic   8/9/2013 3:47:54 AM
NO RATINGS
This is my understanding of how this will work after a cursory look at IBM's documents.

Basically they seem to have built a very efficient, simple, but very flexible and generic neural network. It can be used in two ways.

One way, you use predefined IBM blocks emulated by the generic network. These include, according to the article, "scalar functions, algebraic, logical, and temporal functions, splitters, aggregators, multiplexers, linear filters, kernel convolution (1D, 2D and 3D data), finite-state machines, non-linear filters, recursive spatio-temporal filters, motion detection, optical flow, saliency detectors and attention circuits, color segmentation, a Discrete Fourier Transform, linear and non-linear classifiers, a Restricted Boltzmann Machine, a Liquid State Machine, and more."

I suspect that using these will be a bit like using some very advanced analog blocks, even though the underlying architecture is digital.

The other way of using the corelets is to define your own blocks, and I think this will be for more advanced users.

All these modules can be connected into more complex functions.

It is definitely a paradigm shift from normal digital design and programming, close to an analog/digital FPGA capable of very complex functions. I've never used Matlab, but from the little I know, a Matlab user would be familiar with at least some of the modus operandi.
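To make the composition idea concrete, here is a hedged Python sketch of wiring simple blocks into a larger function; the block names and the compose() helper are hypothetical illustrations, not IBM's actual Corelet API:

# Hedged sketch of the "connect predefined blocks into bigger functions" idea.
# The blocks and compose() helper here are hypothetical, not the Corelet API.

def moving_average(xs, n=3):
    """A stand-in for a 'linear filter' block."""
    return [sum(xs[max(0, i - n + 1):i + 1]) / min(i + 1, n) for i in range(len(xs))]

def threshold_classifier(xs, level=0.5):
    """A stand-in for a simple 'classifier' block."""
    return [int(v > level) for v in xs]

def compose(*blocks):
    """Wire blocks output-to-input, loosely analogous to composing corelets."""
    def pipeline(x):
        for block in blocks:
            x = block(x)
        return x
    return pipeline

detector = compose(moving_average, threshold_classifier)
print(detector([0.1, 0.2, 0.9, 0.95, 0.9, 0.1]))   # -> [0, 0, 0, 1, 1, 1]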

Susan Rambo
User Rank
Blogger
Re: Neural Nets in Hardware
Susan Rambo   8/9/2013 12:24:40 AM
NO RATINGS
Interesting observation. This could be the story of computing: reinvention more than invention. Don't we see this over and over again? Reusable code was renamed SOA! Social media bundled age-old technologies into a single platform! Just to name a couple.
