
Will Google or Apple Disrupt Intel?

<<   <   Page 3 / 3
alex_m1
User Rank
CEO
1T SRAM
alex_m1   8/11/2014 3:22:21 PM
Zvi Orbach mentioned 1T SRAM cells being developed by one of his companies. Since most processors today are mostly made up of large caches, and since this will most likely be used on processes outside Intel (I believe), it could greatly help Intel's competitors.

Also, eASIC recently started offering low-NRE, low-volume 28nm ASIC manufacturing. As Matthieu says, it might be good enough to compete with Intel in some use cases.

Some Guy
User Rank
CEO
If the headline ends in a question mark ...
Some Guy   8/11/2014 3:13:58 PM
Old newspaper headline trick: If the headline ends in a question mark, the answer is "No."

Frankly, it's Intel's to lose. To the extent that it continues to provide the most value, Intel will remain. It's interesting to note that Intel is leading the disruptions: it took on the ARM threat in servers head-on and got its microserver chip to market two years ahead of ARM offerings. And not only has it gone down-market with Atom, which is more power-efficient and/or performant than ARM, it has recently introduced Quark for robotics and the Internet of Things. While it certainly behooves Google and Apple to invest in their own hardware, it doesn't seem likely to pencil out, given the tremendous volumes and the billions in capital it takes each year to be competitive in the chip business.

DouglasMotaDiasDSc
User Rank
Freelancer
Nvidia abandons 64-bit Denver chip for servers
DouglasMotaDiasDSc   8/11/2014 1:16:18 PM
It seems that the Denver project is dead... :-(

Nvidia cancels plans to develop a 64-bit ARM chip for servers as questions linger about the viability of such products

http://www.computerworld.com/s/article/9249291/Nvidia_abandons_64_bit_Denver_chip_for_servers

Matthieu Wipliez
User Rank
Manager
Designing custom accelerators
Matthieu Wipliez   8/11/2014 10:09:11 AM
Multicore designs are running out of gas, given the lack of parallelism in most software. Nevertheless, "there are several really interesting opportunities for new microprocessors."

Indeed, we're still waiting to see the real benefits of those cores! Multicore platforms are tricky to program, and performance is inherently limited by a single shared memory. I think a much more promising platform is many-core with distributed memory (like Adapteva or Kalray). It will still be difficult to write manycore programs, but at least the architecture is sound.
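The "running out of gas" point is essentially Amdahl's law: whatever fraction of a program stays serial caps the speedup, no matter how many cores you add. A quick back-of-envelope sketch (my illustration, not from the original comment; the function name is mine):

```python
# Amdahl's law: ideal speedup on n cores when a fraction p of the
# program's work can be parallelized and the rest (1 - p) stays serial.
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# Even a program that is 90% parallel can never exceed 10x speedup
# (the limit as cores -> infinity is 1 / 0.1 = 10):
for n in (4, 16, 64, 1024):
    print(f"{n:>5} cores: {amdahl_speedup(0.9, n):.2f}x")
```

So unless typical software becomes far more parallel than it is, piling on identical cores behind one shared memory hits diminishing returns quickly, which is the motivation for the distributed-memory and accelerator approaches discussed here.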

I believe that there could be another way. If it were easier to design hardware (for example, using better languages like Cx), people could actually make their own accelerators. Then all you would need is better FPGA architectures (using much less area and power) or a simpler, cheaper way to make ASICs. Lattice seems to be getting pretty good at low-power FPGAs, and eASIC's solution looks interesting for lower-cost ASICs. Maybe we'll get there soon? (The fact that Intel is making a hybrid Xeon-FPGA chip might be another indication.)
