Rumors of Apple/Nvidia devices have been crawling around the industry for a few months now. They make sense for a number of reasons.
By late 2013 the technology will be proven out by multiple vendors, so it's not the bleeding-edge stuff Apple tends to avoid in its high-volume markets. It would create novel products commanding a performance and price premium, the kind of positioning Apple favors.
Late last year Tim Cook tipped Apple’s plans to do something really interesting in semiconductors. I think this could be it.
Nvidia announced a year ago it was working on Project Denver, a full line of ARM-based computer processors from laptops to supercomputers. Since then it has gone quiet. A partnership with Apple would be just the sort of motivation to keep lips sealed at Nvidia.
I saw Jen-Hsun Huang at an industry dinner last week and asked him about Project Denver. A smile came over his face and he said, "We will have a good story to tell."
My guess is this is that story. What do you think?
Nvidia does not have any graphics IP that can compete with the SGX554MP4 in the current iPad in terms of performance per area and per watt. In the first half of next year, we are likely to see Apple implement IMG's next-generation graphics, Rogue, which will increase processing performance considerably. Nvidia has failed to be leading edge in SoC technology despite throwing hundreds of millions of dollars at it, securing only a tiny market share. Just because they could technically do what you are postulating doesn't mean they have a better overall solution than what is already out there.
Nvidia's philosophy has been to use brute force to implement graphics. In contrast, ATI (now AMD) took the other approach and arguably offered the best mobile graphics in the past. Just imagine the steam (i.e., heat) coming off Nvidia graphics cores inside a 3D package. Unless there is a liquid-nitrogen pipe running through the 3D package, or unless 5nm is made available to Nvidia and Apple at the current transistor count, this rumour is simply a pipe dream!
ugh. it would be so much more imaginative to use stacking to create a lower-power module containing compute and memory. imagine a smallish chip dissipating maybe 50W, but which can gluelessly be tiled with a bunch more such chips. a 1U node containing, say, 16 of these would be pretty awesome, and a data-parallel programming model like CUDA could leverage it pretty well...
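a minimal sketch of what that could look like with today's tools, treating each GPU in the system as one "tile" and splitting an embarrassingly parallel job evenly across them; the device count, problem size, and scale() kernel are illustrative assumptions, not anything Nvidia or Apple has announced:

#include <cstdio>
#include <vector>
#include <cuda_runtime.h>

// each element is scaled independently: trivially data-parallel work
__global__ void scale(float *x, float a, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) x[i] *= a;
}

int main() {
    int tiles = 0;
    cudaGetDeviceCount(&tiles);            // one GPU stands in for one "tile"
    if (tiles == 0) return 1;

    const int n = 1 << 24;                 // total elements across all tiles
    const int per_tile = n / tiles;        // assume an even split, for brevity

    std::vector<float>  host(n, 1.0f);
    std::vector<float*> dev(tiles);

    // scatter an even slice of the problem to every tile and launch
    for (int t = 0; t < tiles; ++t) {
        cudaSetDevice(t);
        cudaMalloc(&dev[t], per_tile * sizeof(float));
        cudaMemcpy(dev[t], host.data() + t * per_tile,
                   per_tile * sizeof(float), cudaMemcpyHostToDevice);
        scale<<<(per_tile + 255) / 256, 256>>>(dev[t], 2.0f, per_tile);
    }

    // gather results; the per-tile kernels ran concurrently
    for (int t = 0; t < tiles; ++t) {
        cudaSetDevice(t);
        cudaDeviceSynchronize();
        cudaMemcpy(host.data() + t * per_tile, dev[t],
                   per_tile * sizeof(float), cudaMemcpyDeviceToHost);
        cudaFree(dev[t]);
    }
    printf("host[0] = %f\n", host[0]);     // expect 2.000000
    return 0;
}

in a real stacked module each slice would already live in the tile's own memory, so the host-side scatter/gather above is exactly the glue that glueless tiling would remove.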
I think the window for Apple or Intel to use Nvidia graphics has closed. Imagination Technologies' PowerVR graphics have cornered the mobile market. If more than 50% of Apple's revenue comes from the iPhone, why would it need Nvidia? Apple could have bought Imagination or Nvidia years ago, so why switch to Nvidia now? Apple's chips are in their sixth generation. If Apple truly wanted to control everything, it would need its own CPU core and its own graphics; only buying Nvidia outright would give it full control. What is the major benefit?