Next on the list of likely/unlikely suspects is Intel, the world’s largest semiconductor provider. Though it is highly unlikely such a deal would ever clear regulators (it would probably be slapped down before you could say “antitrust”), it would certainly give Intel a boost in graphics technology, where it has historically lagged.
It’s still highly questionable whether there would actually be any value for Intel in acquiring AMD, though, as the smaller firm has not held a leadership position in CPU technology since 2006.
Apple, the world’s largest company, has the biggest war chest of cash and is under pressure from Wall Street to spend it. By buying AMD, Apple could achieve full CPU/GPU silicon independence for its MacBooks. The firm could also migrate off of Imagination Technologies’ graphics cores for the iPad and iPhone.
Despite Apple’s predilection for owning as much technology in its supply chain as possible, however, the firm has a bit of a checkered past working with AMD, including issues with discrete graphics and lost design wins on CPU.
Microsoft is another dark horse not to be overlooked. The firm already buys AMD hardware for Xbox, using a royalty-based model, and is rumored to be working with the firm on the upcoming Xbox 720.
Also, after past issues with hardware partners, Microsoft has shown itself willing to take a more vertical stance with the recent launch of its Surface tablet. By buying AMD, Microsoft could assimilate the chipmaker’s x86 expertise outright.
Last, and sadly probably least likely, is Nvidia, AMD’s graphics rival. Historically bitter enemies, ATI and Nvidia spent decades duking it out on the graphics front, but buying AMD now would give Nvidia the CPU expertise it needs to challenge Intel and Qualcomm long-term.
The move would also round out Nvidia’s portfolio, boost its enterprise/HPC push and give the company a fully integrated CPU/GPU solution.
Unfortunately, aside from the massive clash of corporate cultures involved, the move would likely face regulatory challenges over a potential discrete-graphics monopoly, and the financial cost would be too high for Nvidia to stomach.
If I were a betting girl, I’d put my money on Samsung, but it’s just as likely that none of the above will ever materialize.
What do you think, readers? Should AMD be bought? And if so, who is your money on?
"The first advantage of such an alliance, of course, would be patents. And lots of them. If Samsung really wanted to pull its punches in its frequent spats with Apple, AMD would offer the firm a."
“To pull its punches” generally means to hit less hard than is possible. Not sure how that would assist Samsung against Apple. Which is, I suppose, a certain “je ne sais quoi”!
I am interested in an NVIDIA/Intel scenario:
How about Nvidia buying AMD’s CPU division, or Intel buying AMD’s graphics division? It would be even more interesting if both happened :)
Would that raise any antitrust issues? Probably not, and it would save the world!
Buying AMD is all about getting into the x86 server space. A few points:
- AMD will survive; Intel has to keep it alive to stay healthy itself. Otherwise Intel becomes the sole x86 supplier, which would open a spot for ARM to get into the $50B server market.
- Qualcomm probably has deep enough pockets ($6–$8B) to get into the server market. As mobile gets vertically integrated, it needs new markets to grow, and it would also gain access to a fab process.
- An Intel buy will not work: antitrust, etc.!
- Samsung has never taken on anything over a billion dollars, or 11K people with a different mindset. They are all about cost optimization.
- Broadcom (BRCM) has a chance, but it is currently busy digesting NetLogic; maybe 2–3 years out.
- Others do not make sense.
AMD needs deep pockets, performance, and advanced process technology to take on Intel.
Samsung?? Come to think of it, they already manufacture DRAM, and the DIMMs and the CPU are the two largest cost components of a server. I can easily imagine a Samsung server with its own CPUs and memory; they could easily pick up a NIC company, and they already know how to bend sheet metal. The only reason Samsung may hesitate is that AMD’s CPUs suck compared to Sandy Bridge/Ivy Bridge right now.