Most users could really save by choosing AMD chips instead of Intel's, because they don't actually need Intel chips to do the job most of the time. IT teams have often picked Intel just to play it safe, but AMD chips can do the same work for less. Resellers just maximize revenue by pushing higher-priced Intel chips on users.
Maybe there is a need for benchmark software to help IT staff analyze their own workloads and determine whether AMD chips can do the job every bit as well as Intel's. AMD's share of the server business is low enough to make me very suspicious - 7% of the server market just doesn't add up!
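As a rough illustration of the kind of benchmark tool meant here, a minimal timing harness might look like the sketch below; the function names and the sample job are placeholders for whatever workload an IT team actually cares about, and the same script would simply be run on an Intel box and an AMD box and the numbers compared.

```python
import statistics
import time

def benchmark(workload, runs=5):
    """Time a representative workload several times and report the median in seconds.

    `workload` is any zero-argument callable standing in for a real job
    (report generation, a compile, a query batch, etc.).
    """
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        workload()
        samples.append(time.perf_counter() - start)
    return statistics.median(samples)

def sample_job():
    # Placeholder CPU-bound job; replace with the task the IT team actually runs.
    return sum(i * i for i in range(1_000_000))

if __name__ == "__main__":
    # Run this same script on both machines, then compare the printed medians.
    print(f"median runtime: {benchmark(sample_job):.3f} s")
```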
AMD fans are always happy:
- Intel's CPU share hits a new high - people will be off the hook sooner or later (x86 vs. ARM)
- Intel continues to lead in process - people point to R&D focus and IBM
- Intel must share the semiconductor crown - people say Samsung does it better
- Intel's net harvest - people cite Apple and blue-sky overhead rates
Another possibility is for AMD to support both architectures. They could support x86 for markets such as PCs, and ARM for mobile computing and smartphones. Or they could augment their x86 products with an ARM processor, but that could turn out to be an expensive solution.
For ARM this is a good move, because AMD would add another licensee to its ecosystem; moreover, it could help ARM get into computing markets where it traditionally hasn't had much traction. For AMD it makes sense because its destiny would be defined by more than what its chief competitor (Intel) does. The only caveat for AMD is that it would now enter a chip space with a sea of competition, including Broadcom, Freescale, Infineon, Qualcomm, Samsung, ST Micro, and TI.
I think this is a great strategic idea for AMD. They will not win a head-on competition with Intel, yet they have tremendous resources and skills at their disposal. Qualcomm has done a nice job with its ARM-licensed Snapdragon processor, optimized for mobile use. AMD could choose another tangent, such as chipsets for low-cost, low-power multiprocessor servers (like SeaMicro, except using ARMs instead of Atoms).
ARM-based processors lend themselves to the low current requirements of smartphones, tablets, and small laptops. In this sense, AMD might benefit by being able to sell into this rapidly expanding market. However, there are lots of competitors with their own ARM-based processors, including Apple, Qualcomm, Nvidia, and Samsung. Qualcomm has years of experience combining its Snapdragon (ARM-based) processor with its modem on a single chip, with smaller size, lower current draw, and other features that pose challenges for less experienced competitors. What can AMD offer to survive in this market?
This is a bad deal for AMD but good for ARM and Intel. AMD would be joining a long line of ARM licensees all trying to outdo one another with basically the same architecture (ARM). AMD is currently in a market where it holds #2 (a very distant #2, but still #2) and has lots of experience with x86. It would be entering a crowded market in which it has no expertise. Intel would pretty much own 100% of the x86 market at that point. ARM just gets to sell yet another license.
AMD does not have to drop x86 - why should they? That said, I think such a deal would be more of an advantage to ARM than to AMD. As for Intel, I would be very worried if I were in their shoes. I do not see increased x86 dominance as a blessing, certainly not in the mobile computing era.
I'm wondering why Nvidia is off the radar in this analysis. Its Project Denver is arguably much further down the planning and development path than anything AMD could start doing now. Nvidia is also claiming the entire high/low TDP range for Project Denver. Not only would that make it a much more ambitious project than anything AMD could conceivably take on with ARM right now, it would also be more complete, as Nvidia already has an ARM-based portfolio.
There is really only one probable outcome: AMD will most likely license ARM CPUs. Two factors favor the continued dominance of x86 - the amount of installed x86 software, and the shrinking die size of future Fusion APUs. Intel has gotten roughly a 30% increase in performance and battery life with every die shrink; Llano should gain similarly, as should the C-class APUs. Monte Carlo simulations and HD playback at the same time, on a tablet or netbook?
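To put the comment's 30%-per-shrink figure in perspective, a quick compounding check shows what it would imply over several node transitions; the 30% number is the commenter's claim and the number of shrinks here is purely hypothetical.

```python
# Back-of-envelope: compound a claimed 30% gain per die shrink.
gain_per_shrink = 0.30   # figure claimed in the comment above
shrinks = 3              # hypothetical number of successive shrinks

factor = (1 + gain_per_shrink) ** shrinks
print(f"After {shrinks} shrinks: {factor:.2f}x the starting performance")
# 1.30 ** 3 is roughly 2.2, i.e. about 2.2x over three node transitions.
```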
AMD could extend its instruction decoder to support ARM instructions in addition to x86, and it may not even cost much die space. While this might not save a lot of power at first, it would be a preparatory step away from x86. Over time, more and more ARM-only cores could be added to the CPU. Also, if done right, mixed-mode code could be made possible without help from Microsoft. AMD should do it simply because Intel was foolish enough to sell its ARM architecture license - a serious strategic mistake.
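The idea of one front end feeding two instruction sets into a common back end can be sketched as a toy model; this is in no way AMD's design, and the miniature instruction formats and micro-op names below are invented purely for illustration.

```python
# Toy model of a dual-ISA front end: one decoder per instruction set,
# both emitting the same internal "micro-op" tuples for a shared back end.

def decode_x86_like(instr):
    # e.g. "ADD eax, ebx" -> ("uop_add", "eax", "ebx")
    op, operands = instr.split(" ", 1)
    dst, src = [o.strip() for o in operands.split(",")]
    return ("uop_" + op.lower(), dst, src)

def decode_arm_like(instr):
    # e.g. "ADD r0, r0, r1" -> ("uop_add", "r0", "r1")  (dest equals first source)
    op, operands = instr.split(" ", 1)
    dst, src1, src2 = [o.strip() for o in operands.split(",")]
    assert dst == src1, "toy model only handles dst == src1"
    return ("uop_" + op.lower(), dst, src2)

DECODERS = {"x86": decode_x86_like, "arm": decode_arm_like}

def front_end(mode, instr):
    """Select a decoder by mode bit; the back end only ever sees micro-ops."""
    return DECODERS[mode](instr)

print(front_end("x86", "ADD eax, ebx"))    # ('uop_add', 'eax', 'ebx')
print(front_end("arm", "ADD r0, r0, r1"))  # ('uop_add', 'r0', 'r1')
```

The point of the sketch is only that the expensive, shared machinery (schedulers, execution units, caches) sits behind the decode step, which is why adding a second decoder need not cost much die area.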
Actually, this is a genuinely new development in the ARM vs. Intel battle.
For AMD it is also a good strategy, but they don't have to stop working on x86 products. They can have both. That way they would enter the Qualcomm, TI, Nvidia, Marvell market.
Of course, with the experience they have from the x86 market, they will be able to offer very interesting products to compete with Intel.
Let me get this right --- the suggestion is that AMD abandons a market where they seem to have a 20% share and a relatively stable financial situation (when averaged over longer periods) and goes to a chip market where nobody but a few handset makers is making a profit, and where instead of being a me-too in a field of two they would be a me-too in a field of tens. Does anyone really think that TI and Samsung and Qualcomm are just going to roll over? Even if AMD instantly gets 20% of that ecosystem they will still make less money than with 20% of the PC share. And let's not exaggerate the importance of the availability of Windows on the ARM architecture. Windows has been available on Itanium since day one, and I don't think anyone is crediting it with the "smashing success" of that particular platform.
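The "20% of a smaller pie" argument is easy to make concrete with a back-of-envelope calculation; the dollar figures below are placeholders chosen only to illustrate the shape of the argument, not actual market data.

```python
# Back-of-envelope comparison of a 20% share in two markets.
# All dollar figures below are placeholders, NOT actual market data.
pc_cpu_market = 35e9        # hypothetical annual PC/server x86 CPU revenue pool
mobile_soc_market = 12e9    # hypothetical annual mobile application-processor pool
share = 0.20

pc_revenue = share * pc_cpu_market
mobile_revenue = share * mobile_soc_market
print(f"20% of PC pool:     ${pc_revenue / 1e9:.1f}B")
print(f"20% of mobile pool: ${mobile_revenue / 1e9:.1f}B")
# With these placeholder pools, the PC share is still worth far more,
# which is the commenter's point; thinner mobile margins would widen the gap.
```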
Now I see why AMD's CEO, Dirk Meyer, left... doing business in the corporate enterprise market (Opteron: servers, HPC) is different from the end-user consumer market (ARM: electronic gadgets). It's like asking a traditional dance teacher to dance in a nightclub. But the decision to move to ARM sounds logical, since Microsoft is also joining the party.
P.S.: With the exception of the new Brazos platform, I think the main revenue from AMD's CPUs came from servers rather than desktops/notebooks.
Can Radeon GPUs be made compatible with ARM? AMD sold ATI's Imageon GPU technology, which was made to measure for ARM, and Qualcomm is now exploiting it as Adreno very aggressively and effectively. So if AMD takes ARM licenses, what do they do for GPUs?
License back from Qualcomm? Yuk!
Start from scratch? Yuk!
License Mali? Yuk?
License ARM architecture including Mali? Hmm.
That would allow them to leverage their existing skills to differentiate their products.
Perhaps they could work together with ARM on future GPUs, as Nvidia says it is doing with CPUs. Just a thought.
Right, there's no story here. ARM is always seeking another design complement, Intel or otherwise. A trade here, an exchange there, just another license. AMD abandon x86? Never. $100,000 wafers for $4,000 wafers? Not likely. Establish a parity position in the ARM cluster? Possibly. Unique and differentiated? Hard to say. Shadow of Intel? Stepped on, dumped on, infiltrated, and dismantled by Intel over and over and over again. Playing catch-up? That has something to do with being stepped on and dumped on, over and over and over again. Definer of the x86 architecture and always playing catch-up: Socket 7, first to 1 GHz, HyperTransport, first to 64-bit x86, first over the 90-nanometer hurdle. Intel always uses others for its riskiest prototypes. Code compatibility at a better price, sure, but only when Intel dumps below cost. Under pressure? More like forced under. From scaling up to scaling out; know Linux? And if multiprocessing that's more than distributed could make all those installed processors do the work, meaning no more processors to sell, what would sales think? This margin: you do understand the difference between $100,000 wafers and $4,000 wafers, don't you? What's not making sense here? 64-bit Windows efficient? Bloat, bloat. Oh, Samsung - you must mean the good Intel? Fusion dev conference? I'm wondering too. One always has to explain to Microsoft exactly why not combining with Intel makes financial sense. Yes, tablets - if the board told anyone to make meager processors below cost, I'd be upset with what that did to employee MBOs and enterprise profitability too. Destroy Xeon, then you can afford to dabble in tablets. ARM? Maybe - but only if those wafers are worth a whole lot Moore. x86 not worth second-sourcing? Are you out of your mind? The value of first-source chips pays for everything else. Licensing x86 would essentially confine ARM to its little $600 million island. Wake up.
AMD has long experience selling microprocessors into desktop, server and notebook computers. This is something which ARM has never been able to do because of a lack of Windows, until now.
Could AMD help ARM power up Windows-based computers?
Intriguing link @Yankiwi... there is an interesting analogy between Intel, Nokia, Microsoft, etc. as large companies that once dominated their markets but are now struggling to maintain their dominance... but they still have billions of dollars in the bank, so you can't write them off just like that... ARM, Apple, and Google might struggle one day too; this is a cycle of good capitalism that is healthy for all of us... Kris