Comments
Page 2 / 3
Robotics Developer
User Rank
Rookie
Re: chip technology?
Robotics Developer   10/14/2013 3:40:05 PM
I think there are spiking neural network models available online somewhere (I remember seeing a link a while ago): http://www.izhikevich.org/publications/spikes.htm

That page has links to the models and some of the early work done on NNs.  Pretty cool if it really does work.  Looking forward to it.
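For anyone curious what the models on that page look like in practice, here is a minimal sketch of the Izhikevich spiking neuron using its standard "regular spiking" parameters; the constant input current and the time step are my own illustrative choices, not values from the page:

```python
# Minimal sketch of the Izhikevich spiking-neuron model linked above,
# with the standard "regular spiking" parameters (a, b, c, d).
# The input current I and the 0.5 ms time step are illustrative choices.
a, b, c, d = 0.02, 0.2, -65.0, 8.0
v = -65.0                  # membrane potential (mV)
u = b * v                  # membrane recovery variable
I = 10.0                   # constant input current
dt = 0.5                   # Euler time step (ms)
spikes = []
for step in range(2000):   # 1 second of simulated time
    if v >= 30.0:          # spike: record it, then reset
        spikes.append(step * dt)
        v, u = c, u + d
    v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
    u += dt * a * (b * v - u)
print(f"{len(spikes)} spikes in 1 s of simulated time")
```

With this input current the neuron has no stable resting point, so it fires tonically; resetting `v` and bumping `u` on each spike is what gives the model its characteristic spike-frequency adaptation.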

moloned
User Rank
Rookie
Re: chip technology?
moloned   10/14/2013 6:26:46 AM
Of course, a supercomputer is not the only way to simulate the human brain, and a domain-specific programmable SoC is one way of doing it.

At 34:40, at the end of Eugenio Culurciello's presentation on his vision SoC, you can hear me asking a question about the power dissipation of his system (I presented #movidius #myriad at the same conference).

The 7,500 chips required in 2011 would dissipate a total of 15kW, compared with the 25W power dissipation of the brain.

Power is now down to about 0.6W per chip according to his latest paper, but a brain-scale system would still be in the kW range:

https://engineering.purdue.edu/elab/blog/wp-content/uploads/2011/11/pham-mwscas-12.pdf

This is a long, long way from what Qualcomm claims it can fit in a handset, where 3-4W is the total for the complete smartphone, including the PA, baseband, WiFi/Bluetooth/GPS, display, and Android OS, and, last but not least, the 600-700mW budget for applications.
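The arithmetic behind those figures is easy to check; this back-of-envelope sketch simply restates the numbers in the comment above:

```python
# Sanity check of the figures above: 7,500 chips dissipating 15 kW in total,
# versus roughly 25 W for the brain.
chips = 7500
total_watts = 15_000.0
per_chip_watts = total_watts / chips      # 2 W per chip in 2011
brain_watts = 25.0
ratio = total_watts / brain_watts         # system power in "brains"
print(per_chip_watts, ratio)              # → 2.0 600.0
```

So even at 2 W per chip, the 2011 system burned roughly 600 times the brain's power budget.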

Kinnar
User Rank
CEO
Very Remarkable results obtained..
Kinnar   10/14/2013 5:20:22 AM
Very remarkable results have been obtained by the Purdue University professor; it seems to be a genuinely working technology. It is not yet well explained how the individual NPUs work as neurons, but this technique will have many roles to play beyond those described in the article and by Qualcomm.

moloned
User Rank
Rookie
Re: chip technology?
moloned   10/14/2013 4:57:50 AM
"The brain is also very power efficient, he explained, consuming only about 20 watts at a cost of under a quarter of a cent per hour, whereas simulating the brain on a conventional von Neumann computer would take up to 50 times more power"

This statement is badly wrong: about 5-6 orders of magnitude off in both FLOPS and watts!

This article reports that an 83,000-processor supercomputer was only able to deliver about 1% of the calculations performed by the brain, and such supercomputers typically dissipate megawatts:

http://gizmodo.com/an-83-000-processor-supercomputer-only-matched-one-perc-1045026757

According to the German supercomputing centre in Juelich, it will take an exaFLOP machine, arriving around 2020, to simulate the entire brain, with a power budget on the order of 20MW.

Given that current supercomputers can only manage about 10 GFLOPS/W, even this figure is in considerable doubt: it would require almost two orders of magnitude of improvement in FLOPS/W over the next 10 years, with Moore's law and supply voltages plateauing.
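A back-of-envelope sketch of the orders-of-magnitude point; the ~10 MW figure for the 83,000-processor machine is my own assumption (such machines dissipate megawatts, per the comment above), not a number from the linked article:

```python
import math

brain_watts = 20.0          # the brain's power budget quoted in the article
machine_watts = 10e6        # assumption: ~10 MW for the 83,000-processor machine
fraction_simulated = 0.01   # ~1% of the brain's calculations, per the linked report

# Naively scale the machine up to 100% of the brain and compare power budgets.
full_brain_watts = machine_watts / fraction_simulated
actual_ratio = full_brain_watts / brain_watts

# The article claims simulation takes "up to 50 times more power".
article_claim = 50.0
error = math.log10(actual_ratio / article_claim)
print(f"article's 50x claim is off by ~{error:.0f} orders of magnitude")
```

Under these assumptions the real power gap is around 5×10^7, so the article's "50 times" figure is off by about six orders of magnitude, consistent with the 5-6 orders claimed above.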

moloned
User Rank
Rookie
Re: chip technology?
moloned   10/14/2013 4:44:19 AM
More hype than substance, I'd say, at least when it comes to the claims about a neural network processing unit (NPU) in hardware.

Looking at the braincorp jobs page http://braincorporation.com/index.php/category/opportunities/ these guys are hosted inside Qualcomm and are focused on building robots, robotics algorithms and tools.  

They appear to be leveraging Qualcomm's GPUs via OpenGL and OpenCL, along with C/C++, Python, and Matlab, to do the NN processing, rather than using some kind of specialised NN processor.

I'd say most of the NN work is being done in Matlab with some kind of back-end generation of C/C++ code, or potentially OpenGL/OpenCL ES code. The smart approach would be to have low-level optimised libraries for the GPU and to call identical libraries from Matlab or Python for rapid development, rather than designing an esoteric compiler.
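The "identical libraries" idea above can be sketched in a few lines of Python: expose one high-level API and swap in an optimised low-level backend when it is present. The module name `fast_nn` is purely hypothetical, used to illustrate the pattern:

```python
# Hypothetical sketch of the design described above: one NN primitive with a
# single high-level API, backed by an optimised library when available and a
# portable fallback otherwise. The module name fast_nn is illustrative only.
try:
    import fast_nn                      # hypothetical optimised GPU backend
    dot = fast_nn.dot
except ImportError:
    def dot(a, b):                      # portable fallback, same signature
        return sum(x * y for x, y in zip(a, b))

# Callers never see which backend ran, so prototyping in Python/Matlab and
# deployment on the GPU share one interface, as the comment suggests.
print(dot([1.0, 2.0, 3.0], [4.0, 5.0, 6.0]))  # → 32.0
```

The design choice here is that optimisation effort goes into the backend library once, while every front end (Matlab, Python, generated C/C++) calls the same entry points, avoiding the need for an esoteric compiler.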

I'd guess this may lead to NPU support being added to the ISA of future GPUs at some point, if they really need it.

rick merritt
User Rank
Author
Re: The role of federal research funding
rick merritt   10/13/2013 3:27:36 PM
Good to see Qcomm pumping some of its profits into long-term research.

Terry.Bollinger
User Rank
Manager
The role of federal research funding
Terry.Bollinger   10/13/2013 1:48:54 PM
When I first saw this I thought, "Rats! Why can't all of the fantastic and very similar work that Eugenio Culurciello has been doing for years under Office of Naval Research (ONR) funding get this kind of press...?" This also shows why one should read figure captions. From said caption on the familiar-looking first figure I finally realized that this _is_ the next phase for Dr Culurciello's amazing chips! Since shutting down the government seems to be the theme-du-jour, it's worth pointing out just how huge a role federal funding plays in giving folks like Dr Culurciello a chance to move the ball on some wild new idea to the point where a large company like Qualcomm can see the potential and catch the pass. That's the kind of teaming where everyone benefits.

R_Colin_Johnson
User Rank
Blogger
Re: NPU for automotive?
R_Colin_Johnson   10/13/2013 12:54:44 PM
@junko.yoshida Re: NPU for automotive?

Next year, Qualcomm will release its suite of software tools, which work with an FPGA emulator, for developers to use when creating applications for its NPU. Regarding automotive, Purdue University professor Eugenio Culurciello has already shown that recognition of roadside scenes can yield real-time classification into pedestrians, vehicles, buildings, etc., but I suspect it will take a year or two for developers to start making good use of this type of information in collision avoidance and similar automotive applications.

R_Colin_Johnson
User Rank
Blogger
Re: chip technology?
R_Colin_Johnson   10/13/2013 12:46:08 PM
Qualcomm is not revealing many details yet, but it is using a spiking neural network model, a digital representation that is compatible with standard CMOS processing.

_hm
User Rank
CEO
Good news but add optical processing to it
_hm   10/12/2013 7:46:49 PM
It has been almost two decades. But now, it is good news.

Can Qualcomm add optical processing to it? That would give it a real edge for the future of computing.


