Comments
re: Intel claims MIC beats GPUs for parallelism
mike655mm   12/1/2011 11:57:51 PM
So Sylvie, a newbie, are you? I've been watching rapid CPU advances as an IC designer for almost 40 years. While I certainly don't expect a MIC-in-workstation in 2012, you can bet it'll get there at some point. 22nm is arriving in just a few months. While a supercomputer will use over a thousand MICs to generate 1 petaflop, you only need one in a Xeon-based workstation today to produce 1 teraflop. I think there are plenty of labs that could use this. While a single-MIC workstation isn't likely a big priority for Intel now, you can bet it's on the list in the next year or two. Adding a single co-processor wouldn't be hard, and it's already supported by software.
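For scale, here's the back-of-envelope arithmetic behind those numbers as a tiny C sketch (nominal, rounded figures only -- not Intel's actual specs):

    #include <stdio.h>

    int main(void)
    {
        const double card_flops    = 1.0e12; /* ~1 teraflop per MIC card */
        const double machine_flops = 1.0e15; /* 1 petaflop supercomputer */

        /* ~1000 cards for the big machine vs. one for a workstation */
        printf("cards per petaflop machine: %.0f\n",
               machine_flops / card_flops);
        return 0;
    }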

re: Intel claims MIC beats GPUs for parallelism
KarlS   11/25/2011 3:23:41 PM
Well, I am a systems as well as a HW guy. Both MIC and GPU first bring raw data into memory, and it does not matter whether a core or a GPU processes it; it must be fetched from memory either way. The video refers to this data movement as a problem for the GPU only. This is typical sales hype. A better approach is to bring the raw data into LOCAL memory and do the processing in the GPU or some other PU, preferably one programmed in OpenCL. Then the only data movement is the processed data going back into main memory. Yes, a work unit must be passed to the GPU along with the raw data, but since the same processing is applied to different data over and over, the code should reside in local memory, eliminating that transfer.
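To make that concrete, here is a minimal OpenCL C kernel sketch of the local-memory staging KarlS describes (the kernel name, arguments, and compute loop are placeholders for illustration, not from any real product):

    /* Stage raw data from global (main) memory into on-chip __local
       memory once, do the work there, and write back only the result. */
    __kernel void process_tile(__global const float *raw,    /* raw input        */
                               __global float       *result, /* processed output */
                               __local  float       *tile)   /* on-chip scratch  */
    {
        size_t gid = get_global_id(0);
        size_t lid = get_local_id(0);

        /* One read per work-item pulls the raw data into local memory. */
        tile[lid] = raw[gid];
        barrier(CLK_LOCAL_MEM_FENCE);

        /* Placeholder compute: all further accesses hit local memory,
           not the off-chip bus. */
        float acc = 0.0f;
        for (int i = 1; i <= 8; ++i)
            acc += tile[lid] * (float)i;

        /* The only traffic back to main memory is the finished result. */
        result[gid] = acc;
    }

Once built, the compiled kernel stays resident on the device, so re-enqueueing it over fresh batches of raw data moves only the data, not the code -- which is the reuse KarlS is pointing at.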

re: Intel claims MIC beats GPUs for parallelism
Mxv   11/24/2011 3:27:58 AM
Let's see: a bunch of 386 cores with no DMA or onboard I/O except for PCIe. No separate buses to connect the cores (just shared memory). No cost amortization from the graphics business. Yeah, sounds like a real winner. I couldn't care less about GPUs -- I'm a HW guy. If the FPGA vendors would drop their prices, I could design cost-competitive accelerators that would run circles around these multicore heaters.

re: Intel claims MIC beats GPUs for parallelism
markhahn0   11/23/2011 8:47:14 PM
I'd be surprised if Intel didn't offer MIC in 2012: it'll probably be a card similar in size, price, and power dissipation to Nvidia's Tesla. Whether Intel offers a line similar to GeForce (that is, "desktop-priced" at $300 rather than $3,000) remains to be seen. I'm not sure why you'd swap a server for it, though: it's not designed for server workloads; it's designed for compute-intensive stuff.

re: Intel claims MIC beats GPUs for parallelism
SylvieBarak   11/23/2011 6:01:23 AM
Wow, how loaded are you, Mike?? ;) 'Cause that shizzle ain't gonna come cheap :) (Then again, neither does a Xeon....)

re: Intel claims MIC beats GPUs for parallelism
mike655mm   11/22/2011 11:51:28 PM
Wow. How close am I to swapping the Xeon-based server box in my office for one with a single 50-core MIC co-processor capable of 1 TFLOP?


