
Inside Intel's Haswell with tour guide David Kanter

danny1024
User Rank: Rookie
re: Inside Intel's Haswell with tour guide David Kanter
danny1024   11/20/2012 3:11:14 AM
Perhaps a magic information fairy pointed him to an Intel Technology Journal article, "TERA-SCALE MEMORY CHALLENGES AND SOLUTIONS" (http://download.intel.com/technology/itj/2009/v13i4/pdfs/ITJ9.4.7_MemoryChallenges.pdf), and this corresponding patent, "Systems, methods, and apparatuses for hybrid memory" (http://www.google.com/patents/US20110161748).

rick merritt
User Rank: Author
re: Inside Intel's Haswell with tour guide David Kanter
rick merritt   11/16/2012 4:55:48 PM
Sounds similar to what Huawei/Altera are doing and what I expect other comms and server OEMs will try out over the next year or so. http://www.eetimes.com/electronics-news/4401446/Huawei--Altera-mix-FPGA--memory-in-2-5-D-device

chipmonk0
User Rank: CEO
re: Inside Intel's Haswell with tour guide David Kanter
chipmonk0   11/15/2012 5:04:06 PM
Informed guess - must leave it at that!

rick merritt
User Rank: Author
re: Inside Intel's Haswell with tour guide David Kanter
rick merritt   11/15/2012 4:23:19 PM
That's big news @chipmonk. What's your source on it?

chipmonk0
User Rank: CEO
re: Inside Intel's Haswell with tour guide David Kanter
chipmonk0   11/14/2012 10:39:15 PM
Haswell is going to be a 2.5D module with the Level 4 cache chip next to the processor, the chips connected by fine-pitch, high-density thin-film interconnects on the Si substrate of the module. It will have lots of interconnects, enabling lots of parallelism in memory access by the multi-core CPU/SoC. BTW, they won't be able to stack the chips (true 3D) because they need to take heat out of the 10-watt CPU.
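For a feel for the latency side of chipmonk0's argument, here is a minimal back-of-the-envelope sketch using the standard average-memory-access-time (AMAT) recurrence. The hit rates and latencies below are made-up placeholders, not Haswell figures, and the bandwidth/parallelism benefit of a wide interposer interconnect is not modeled at all.

# Back-of-the-envelope AMAT comparison: L3 -> DRAM versus L3 -> L4 (eDRAM) -> DRAM.
# All latencies (ns) and hit rates are hypothetical placeholders, not Haswell data.

def amat(levels, dram_latency_ns):
    """levels: list of (hit_latency_ns, hit_rate), ordered from closest to farthest cache."""
    total, miss_fraction = 0.0, 1.0
    for hit_latency, hit_rate in levels:
        total += miss_fraction * hit_latency      # accesses reaching this level pay its latency
        miss_fraction *= (1.0 - hit_rate)         # only its misses continue outward
    return total + miss_fraction * dram_latency_ns

# Hypothetical hierarchy as seen from the L3, with and without an L4 cache in between.
without_l4 = amat([(10.0, 0.50)], dram_latency_ns=80.0)
with_l4 = amat([(10.0, 0.50), (30.0, 0.60)], dram_latency_ns=80.0)

print(f"AMAT without L4: {without_l4:.0f} ns")   # 10 + 0.5*80            = 50 ns
print(f"AMAT with L4:    {with_l4:.0f} ns")      # 10 + 0.5*(30 + 0.4*80) = 41 ns

Even with these toy numbers, the win comes from a nearby L4 intercepting a chunk of what would otherwise be DRAM trips; in a bandwidth-bound workload the gain would be larger still because of the parallelism chipmonk0 mentions.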

markhahn0
User Rank: Rookie
re: Inside Intel's Haswell with tour guide David Kanter
markhahn0   11/14/2012 9:49:35 PM
Stacking memory on the CPU makes a lot of sense for any lower-power chip - after all, it's pretty routine in phones. Stacking not only gives a performance boost, it saves some power. Probably hard to do with a bigger/hotter chip, though. 10 W is certainly workable for a tablet, as long as it can race to sleep, keep leakage low, etc. I just wish AMD would grow some balls and produce, for instance, an APU with stacked DRAM so you could tile a bunch of them onto a board. Whatever happened to the idea of scalable multiprocessor systems anyway? (With a built-in scalable GPU for free!)

Doug_S
User Rank: CEO
re: Inside Intel's Haswell with tour guide David Kanter
Doug_S   11/14/2012 9:19:55 PM
You sure about that, Sylvie? 10 watts seems like way too much for a tablet; it'll either burn you or require a fan. There won't be a lot of buyers for a tablet with a fan, even if it is more powerful.

SylvieBarak
User Rank: Rookie
re: Inside Intel's Haswell with tour guide David Kanter
SylvieBarak   11/14/2012 7:19:25 PM
Can't wait for Haswell tablets! Now THOSE will be tablets worth buying in my opinion...

resistion
User Rank: CEO
re: Inside Intel's Haswell with tour guide David Kanter
resistion   11/14/2012 5:13:21 PM
It still seems a little rumor-like to me. I also saw Anand report it as embedded DRAM, as if on-chip, but if it is off-chip, would they use TSVs for the speed? Isn't the normal course to make it on-chip SRAM?
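On the SRAM-versus-eDRAM question, density is the usual argument for eDRAM in a large last-level cache: a 1T1C eDRAM cell is several times smaller than a 6T SRAM cell. The sketch below uses made-up cell areas and array efficiency purely to illustrate the ratio; they are not figures for any specific process.

# Purely illustrative density comparison for a large cache: 6T SRAM vs. 1T1C eDRAM.
# Cell areas (um^2) and array efficiency are placeholder values, not process data.

CELL_AREA_UM2 = {
    "SRAM (6T)": 0.10,
    "eDRAM (1T1C)": 0.03,
}

def megabytes_per_mm2(cell_area_um2, array_efficiency=0.5):
    """Usable MB per mm^2, assuming only part of the area is bit cells."""
    bits_per_mm2 = (1e6 / cell_area_um2) * array_efficiency   # 1 mm^2 = 1e6 um^2
    return bits_per_mm2 / (8 * 1024 * 1024)

for name, area in CELL_AREA_UM2.items():
    print(f"{name}: ~{megabytes_per_mm2(area):.1f} MB per mm^2")

With placeholder numbers like these, the eDRAM array packs roughly three times the capacity into the same silicon, which is why tens of megabytes of L4 are more plausible as (e)DRAM, on-package or off, than as on-chip SRAM.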

rick merritt
User Rank: Author
re: Inside Intel's Haswell with tour guide David Kanter
rick merritt   11/14/2012 3:00:55 PM
Interesting, and something I have never heard of in a CPU before: an L4 cache, let alone an off-chip one.
