Ever since that report from the Apple reorg about their "ambitious" semiconductor plans, I've been wondering what that meant.
This is a reasoned take on it.
I am wondering if Nvidia might have some role in this play, too. Project Denver, where are yooooouuuu?
In no particular order:
A quick search on LinkedIn for "Binary Translation" (BT) shows something of a mass exodus of such specialists from Nvidia within the last two years. BT was (is?) the essential enabler for "Project Denver".
Hey Dylan, thanks for leaving out the key word: 2017. If you are going to copy an article or reference an article, at least understand the gist of it before throwing it up all over here. You and Rick make a nice pair. Not fit to be a reporter, let alone a tech one.
I'm sure the guys at PA are really brilliant, and many of them would probably work their asses off to beat Intel on its home turf. But not only do they have to design an architecture (ARM-based?) that is better than what Intel has shown us today, they also need to beat whatever Intel has been up to for the last few years while loitering ahead of AMD's floundering attempts to catch up. Also, whatever they cook up has to beat Intel's node advantage, no small thing in itself. And they need to do this without stepping on any Intel patents. That is a tall order, even with Intrinsity's secret sauce.
There's a tradeoff; Apple has to pay ARM for the architectural license and possibly per-chip royalties depending on the licensing agreement.
Apple's competition also pays licensing fees (Qualcomm, Samsung, etc.).
Going with Intel enables Apple to retain Intel's very generous volume/courtesy discounts on the silicon while at the same time leveraging Intel's large driver and compiler teams.
This in turn allows Apple to focus on innovating in other areas; OS/Middleware/Applications while at the same time improving margins and/or enabling Apple to price their products more competitively.
The "architecture" (x86 vs ARM) is irrelevant. x86 is actually a disadvantage, but a minor one, since all Intel chips since the Pentium Pro immediately translate x86 instructions into RISC-like internal micro-ops.
What's left is microarchitecture and circuit technology. Of these, Intel currently leads in absolute performance per core, while ARM leads in performance per watt, which translates into throughput performance.
The node advantage is expected to shrink going forward, if not disappear entirely. Newer process shrinks (22 nm to 14 nm) are no longer providing dramatic reductions in power or increases in frequency. So if the Apple design team can get close to Intel's performance, with better integration of other functions, management would take that and get a huge improvement in margins.
Remember that Apple is already pretty much vertically integrated.
Integration is really the key challenge and it's unclear that Apple can or wants to do that. Intel's recent Digital RF on CMOS Atom SOC ("Rosepoint") is the first of its kind; no one else has demonstrated that level of integration on a modern process tech node.
Apple are looking at how they can continue to differentiate their products to keep a whack of profit. If consumers realise that a Mac is just a computer, the same as a Sony or a Lenovo, they won't pay the premium. The differentiator is needed to continue delivering the $8bn per quarter that feeds the stock price.
Aren't 75% of Apple revenue from iPhone/iPad?
Isn't Samsung the biggest competitor of Apple in smartphones and tablets?
Isn't Samsung manufacturing and helping design SOCs for Apple?
Who is going to manufacture this ARM processor for the iMac? Most likely Samsung.
So what good would that do Apple?
" If the consumer realises that all a Mac is, is a computer the same as a Sony or Lenovo, they won't pay the premium."
Which has basically never been the case in the ~40-year history of Apple.