The battle is not over yet. I remember when Apple and Microsoft had a battle before. Every engineer knew that Apple's OS was better than Microsoft's back then, but MS had the larger market share and was popular. So Apple was almost dead. But guess what? They came back through new innovations, and now ...
This is a sad story with an unexpected ending. Please let me describe why.
Nowadays the semiconductor industry has entered the efficiency phase, and power efficiency is one of its most important assets.
MIPS and SPARC were the pioneers of RISC architecture. A RISC architecture on average uses a bit less power to execute an instruction.
So a RISC processor built on the same process technology, with the same level of compiler, will use a bit less power. Moore's Law has given us the ability to put more transistors (switches) in the same area, and better architectures, both for the instruction set and for power saving, are giving us the ability to use those transistors.
The problem of MIPS, in my opinion, stems from its focus on markets (workstations, TV SoCs, MPEG, STBs, wireless routers, and switches) where it had no sustainable advantage and where its power-efficiency advantage was less important. The lack of focus on application processors and Android-based mobile devices is what killed MIPS's growth potential.
This division of MIPS's assets looks very unusual; to put it in perspective with a World War II comparison, it looks like Germany (MIPS) being divided between the Soviets and the Western Allies.
I would venture to guess that ARM and Imagination are locked in mortal combat to be the IP that will reign supreme in portable devices, with ARM also hedging toward data centers.
What say you ??
Most non-engineering people today will be surprised to learn that ARM (Advanced RISC Machine) is a design spec out of Britain. The spec was a follow-on to simpler architectures like Motorola's 6800 and MOS Technology's 6502. While some people considered these chips "CISC", by other definitions (such as single-instruction-per-clock execution) they were actually "RISC", which is why they seemed so powerful in early personal computers like the Commodore PET and the Apple II. But these inexpensive chips did complicated things (like floating point) in software, so American companies pushed for CISC-like successors already found in American minicomputers like the HP 3000, System/36, PDP-11, and VAX.

We really see this CISC thing get out of control with chips like Intel's Pentium line, where streaming architectures (MMX, SSE, SSE2, etc.) do DSP under the acronym SIMD (single instruction, multiple data). It was at this time that you saw some American companies switch back to expensive "pure RISC" chips like SPARC (Sun), PA-RISC (HP), Alpha (DEC, developed after doing business with MIPS), and POWER (IBM). Why? Because computing professionals already knew that RISC actually meant "Relegate Important Stuff to Compiler".

The expensive RISC chips all died for one reason or another, leaving the ARM design as the only one standing when companies wanted to expand from phones to smartphones and pads. I find it slightly amusing that more of this technology is moving from the cowboy culture of America back to the more business-like culture of Britain.
Well, what else could they say? "We bought this biz and we will kill it now"??? A large number of companies in the last 20 years have said they would keep the technology, then quietly (or not so quietly) killed it later on.
Sure, that is what they would say, but what is the long-term viability of that approach? If MIPS could not survive on its own, how will Imagination push MIPS designs?
I think this is a short-term strategy to continue supporting existing customers so they don't get sued for violating support contracts.
The Imagination CEO also talked about a lot of similarities between its Meta CPU and MIPS's.
The follow-up analysis story, discussing Imagination's promise that it won't kill MIPS, is posted here: