News & Analysis

Panel ponders many-core ICs tripping 'the singularity'

3/28/2012 04:09 PM EDT
Comments (Newest First)
Mxv (CEO)   4/3/2012 10:00:04 PM
re: Panel ponders many-core ICs tripping 'the singularity'
@defendor: You're the idiot. Come up with a better plan, genius.

KeithSchaub (Manager)   4/3/2012 7:49:19 PM
re: Panel ponders many-core ICs tripping 'the singularity'
BTW, we already know how to make birds: http://blog.ted.com/2011/07/22/wow-smartbird-in-the-wild-swarmed-by-seagulls/

moloned (Rookie)   4/2/2012 4:21:56 PM
re: Panel ponders many-core ICs tripping 'the singularity'
The big problem with this argument is that we don't know how the human brain works yet, so how on earth can we simulate it? The second problem is that even if we did know how it worked, the human brain dissipates about 25 W, roughly one millionth of what an exaFLOP computer would require. This limitation means that very few of these artificial "brains" will be built until our technology can rival the efficiency of the brain. Even assuming Moore's law applies (doubling performance every 18 months), it would take roughly 30 years from the moment we have a "brain computer" to the moment it is as efficient as a human brain. Moral of the story: don't hold your breath!
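For a quick sanity check on that timescale, here is a back-of-envelope calculation under the comment's own stated assumptions (a roughly 10^6 efficiency gap and one doubling every 18 months); a minimal sketch, not a prediction:

```python
import math

# Back-of-envelope estimate, assuming (as the comment does) a ~1e6 gap between
# the brain's ~25 W and the ~25 MW an exaFLOP machine might need, with
# efficiency doubling every 18 months.
efficiency_gap = 1e6
months_per_doubling = 18

doublings = math.log2(efficiency_gap)            # ~19.9 doublings
years = doublings * months_per_doubling / 12.0   # ~30 years

print(f"{doublings:.1f} doublings -> about {years:.0f} years")
```

The exact horizon obviously shifts with the assumed gap and doubling period, but the order of magnitude (decades) is robust.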

Nicholas.Lee (Rookie)   4/1/2012 11:09:39 PM
re: Panel ponders many-core ICs tripping 'the singularity'
Step 1: You make a beefburger; it doesn't moo. Step 2: You make a series of bigger and bigger beefburgers; they still don't moo. Step 3: You put multiple beefburgers in the same bun; it still won't moo. Conclusion: it's not the speed, size, or number of beef cells you have, it's how they are connected together that determines whether it will moo. In other words, true AI won't be achieved by greater processing power alone; we need to solve the hard problem of working out the right architecture first.

DrQuine (CEO)   3/30/2012 2:27:03 AM
re: Panel ponders many-core ICs tripping 'the singularity'
I see a key distinction between "expected / known" and "unexpected / unknown" problem solving. Computers are complete champions in arithmetic - most of us happily hand over such tasks because the computer doesn't get tired, distracted, or careless. On the other hand, open ended inference problems are much more efficiently solved by humans - perhaps using the computer as an information retrieval engine to gather appropriate data. As computers "learn new tricks", that boundary will continue to shift.

PJames (Rookie)   3/29/2012 10:13:19 PM
re: Panel ponders many-core ICs tripping 'the singularity'
Do you have something else in mind? The amount of parallelism within a single thread is often fairly limited, and modern processors with multiple issue and speculative execution are bumping up against diminishing returns. If you go to multiple threads, it simply becomes a tradeoff between making a single "core" execute more and more threads and simply replicating the core. More transistors mean more parallelism. If the basic model of computing as sequences of logical and arithmetic operations is retained, one can carve up that parallelism at different levels, but the results remain largely a matter of optimization rather than some radically better vision.
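To put a number on the diminishing-returns point, one classical framing (not named in the comment, so take it as an assumed illustration) is Amdahl's law, which caps the speedup from adding parallel resources whenever part of the work stays serial; a minimal sketch:

```python
# Amdahl's law: speedup = 1 / ((1 - p) + p / n), where p is the fraction of the
# work that can run in parallel and n is the number of parallel units.
def amdahl_speedup(parallel_fraction: float, n_units: int) -> float:
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / n_units)

# Even with 90% of the work parallelizable, extra units quickly stop paying off;
# the speedup can never exceed 1 / (1 - 0.9) = 10x.
for n in (2, 4, 8, 16, 64, 1024):
    print(n, round(amdahl_speedup(0.9, n), 2))   # 1.82, 3.08, 4.71, 6.4, 8.77, 9.91
```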

BrainiacV (Rookie)   3/29/2012 2:33:40 PM
re: Panel ponders many-core ICs tripping 'the singularity'
I've always argued along the lines of "We can build planes, but we can't build birds." We achieve the same function, but through different means. Machine intelligence will be different from human intelligence. Planes don't fly with the grace of birds. That gray squishy thing in our skulls is not just driven by its interconnections; it is also influenced by the chemicals flowing through it. We forget things; computers wouldn't. But is that forgetfulness part of how we function? I look forward to true AI, but I don't expect it to be something that can really pass a Turing test, any more than I expect a plane to perch in a tree.

defendor (Rookie)   3/29/2012 12:45:30 PM
re: Panel ponders many-core ICs tripping 'the singularity'
This guy in particular sounds like an idiot: Pradeep Dubey.

defendor (Rookie)   3/29/2012 12:42:20 PM
re: Panel ponders many-core ICs tripping 'the singularity'
Wondering how they haven't all really solved the problem, with these great geniuses thinking about it. The title should read: "Self-proclaimed multi-core expert panel doesn't know jack about artificial intelligence." Switching to multi-core design is really an admission of intellectual bankruptcy: well, we can't really figure out how to make a better microprocessor, it's too academically difficult to think about with the types of people we hire these days, so let's just do the obvious thing and plunk down as many of them on a chip as we possibly can fit.
