Background: IBM's brain-inspired architecture consists of a network of neurosynaptic cores. Cores are distributed and operate in parallel. Cores operate without a clock, in an event-driven fashion. Cores integrate memory, computation, and communication. Individual cores can fail and yet, like the brain, the architecture can still function. Cores on the same chip communicate with one another via an on-chip event-driven network. Chips communicate via an inter-chip interface, enabling seamless, cortex-like scalability and the creation of large neuromorphic systems.
Background: A video camera on Hoover Tower at Stanford University looks down at the plaza below. A simulated network of IBM TrueNorth chips takes in the video data and locates interesting objects. Objects might look interesting to the system because they are moving or have a different color or texture than the background. The system then further processes those interesting portions of the video to determine what the objects are. It is trained on several specific categories, such as buses, cars, people, and cyclists. In a monitoring application, the camera would only need to communicate when it found an interesting object, rather than continually streaming video to a central location.
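For concreteness, the first stage of the pipeline described above (flagging regions that differ from the background before any classification runs) can be sketched with plain background subtraction. This is a generic illustration, not the TrueNorth system's actual algorithm; the threshold value and the single-box output are assumptions for the sketch.

```python
import numpy as np

def salient_mask(frame, background, thresh=30):
    """Flag pixels whose brightness differs enough from a background model.
    frame, background: 2-D uint8 grayscale arrays of equal shape."""
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    return diff > thresh

def bounding_box(mask):
    """Smallest box (x0, y0, x1, y1) covering all salient pixels, or None."""
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return None
    return (xs.min(), ys.min(), xs.max(), ys.max())

# A static background with one bright "object" entering the frame.
background = np.zeros((120, 160), dtype=np.uint8)
frame = background.copy()
frame[40:60, 70:90] = 200  # an object brighter than the empty plaza

print(bounding_box(salient_mask(frame, background)))  # (70, 40, 89, 59)
```

Only the cropped box would then be passed to a classifier trained on the specific categories (buses, cars, people, cyclists), which is what lets a monitoring camera stay silent until something interesting appears.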
I am also of the opinion that such a complex chip will require a much more advanced programming language and development environment for application engineers to be able to develop working products.
I would not be surprised if, in the current scenario, a supercomputer were required to program such a complex chip!
I really love the concept. But what I'm really looking forward to is seeing what sort of real-life applications this chip can handle. The programming language and environment are also something that can be pretty complex for the more 'hands-on' engineer.
I dabbled in neural networks a bit before and experimented with replacing traditional control loops with neural ones, but the results were pretty underwhelming. Sure, the neural networks were better, but not by much, and the sheer number of data sets I needed, plus the effort to train them offline, wasn't all that appealing to me.
Perhaps the development environment can take a bit of the sting off the process. Has anyone else used neural networks for more complex stuff?
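The offline workflow the commenter describes (log data from an existing controller, then train a network to imitate it) can be sketched minimally in numpy. Here a single linear neuron trained by gradient descent stands in for the network, and the logged data is synthetic; the gain values and data shapes are assumptions for illustration only.

```python
import numpy as np

# Hypothetical logged data from a plant under a hand-tuned PID controller.
# Inputs: (error, integral of error, derivative of error); target: control output.
rng = np.random.default_rng(0)
Kp, Ki, Kd = 2.0, 0.5, 0.1                 # the "true" gains to imitate
X = rng.normal(size=(500, 3))              # logged (e, integral e, de/dt) samples
y = X @ np.array([Kp, Ki, Kd])             # PID outputs recorded offline

# Offline training: a single linear neuron fit by gradient descent on MSE.
w = np.zeros(3)
for _ in range(2000):
    grad = X.T @ (X @ w - y) / len(y)      # gradient of mean squared error
    w -= 0.1 * grad

print(np.round(w, 3))                      # recovers approx. [2.0, 0.5, 0.1]
```

Even this toy case shows the commenter's pain point: the network only ever becomes as good as the controller that generated the logs, and all the effort sits in collecting and curating those data sets before deployment.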
The comment about the chip versus the politicians is not correct! You must be talking about USA politicians who have no brains at all. Vote for Netanyahu or Hero Putin if you want brains and action! IBM and Apple will change the way computing occurs at the personal level. Great going IBM!!
What are the engineering and design challenges in creating successful IoT devices? These devices are usually small, resource-constrained electronics designed to sense, collect, send, and/or interpret data. Some of the devices need to be smart enough to act upon data in real time, 24/7. Are the design challenges the same as with embedded systems, but with some developer and IT skills added in? What do engineers need to know? Rick Merritt talks with two experts about the tools and best options for designing IoT devices in 2016. Specifically, the guests will discuss sensors, security, and lessons from IoT deployments.