The story of how NXP has re-invented itself by divesting a lot of what used to be core businesses isn't exactly easy to sell—especially among those of us who knew the Dutch chip company in its heyday, when it was still Philips Semiconductors.
I'll admit some nostalgia here. I remember, for example, when then Philips Semiconductors CEO Doug Dunn unveiled TriMedia, the company's first VLIW architecture-based processor. And it was fun listening to CTO Theo Claasen talking about next-generation process technologies, the company's "platform" strategy, and its plans for Crolles 2.
Now fast forward to NXP, which was spun out of Royal Philips Electronics and forced to fend for itself. The newborn acronym quickly shed close to 5,000 jobs, managed down its debt load, and—in August 2010—raised nearly $450 million through an initial public offering.
Along the way, NXP handed off its mobile/wireless group to ST-NXP Wireless, a joint venture established in 2008; NXP also sold its television systems and set-top box business lines—along with its crown-jewel video IP—to Trident Microsystems in 2009.
NXP's message to the world in the last two years was consistent: "Our new focus is on high-performance mixed-signal products."
Talk about your mixed signals! This is when many of us either got confused or just plain lost interest in NXP.
We understood the corporate line here. But what is, really, a high-performance mixed signal? What are its products? Why are they suddenly so important? How significant a business can you get from mixed signals—not just for NXP but for the rest of the semiconductor industry?
Many of us grew up covering highly integrated digital SoCs, advanced processor architectures and next-generation multimedia applications processors for the next mobile handsets—all of which somehow conditioned us to see them as "glamorous." In contrast, high-performance mixed-signal products seemed—to put it politely—"peripheral."
During the recent Embedded Systems Conference in San Jose, I sat down for about an hour with Mike Noonen, NXP's executive VP for global sales and marketing. Noonen described NXP's new [high-performance mixed-signal] strategy more effectively than anyone I've heard before. He said, in essence, "Today, NXP offers products with no big chip in the middle."
Now, I get it.
Picture a block diagram for a handset or media tablet without that big fat SoC hogging all the real estate. No big chip in the middle. Get it?
NXP, once armed with a coveted TriMedia-based solution, no longer offers a big SoC for digital TVs or set-top boxes, either.
Instead of throwing more money at the crowded, cutthroat digital SoC market, up against companies like Broadcom, Qualcomm, MediaTek and MStar, NXP now focuses on supplying the mixed-signal components that these bigwigs don't have.
Look back on NXP several years ago, when it was still bloated with a large—and nebulous—product portfolio. The company now has a sharper focus and a leaner strategy. In fact, "no big chip in the middle" should become NXP's new corporate tagline. It sums up everything that makes the company appealing to Wall Street.
But of course, as a skeptic (journalists are paid for skepticism), I wonder if doing everything but a big chip in the middle is a viable long-term strategy. Much of what you do still aims, eventually, for integration, doesn't it? That's the nature of the semiconductor business.
It's good that their strategy seems to be working.
I'm an ex-NXP guy, and from my experience with so many other companies, I know for sure there is hardly a place where I found engineering quality as solid as at Philips Semiconductors. What killed them was a culture where one good designer would work his tail off while a myriad of others would just eat, drink and make merry.
Look at the innovation that came from it...
I2C, the Compact Disc, the first pager, receiver architectures, the best ADCs... so much, so much. Whatever Samsung is today is because it used to buy its system solutions directly from Philips. Sad that they fell so far—the sole reason being the myriad of bullshit amid some classical European engineering.
So in general, NXP would like to build more ASSPs pairing specially designed CPUs with the necessary analog front ends—but my question is: how high-performance can they be? With high-resolution ADCs and DACs (say, 16-bit or higher), I wonder how noise from the CPU can be kept out of analog circuits that share the same substrate.
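The commenter's concern can be put in numbers with the standard quantization-noise rule of thumb (my own illustration, not from the article or NXP): an ideal N-bit converter achieves an SNR of 6.02N + 1.76 dB, so a 16-bit ADC targets roughly 98 dB, and any substrate-coupled digital noise eats directly into the effective number of bits:

```python
def ideal_snr_db(bits: int) -> float:
    """Ideal SNR of an N-bit ADC, assuming only quantization noise."""
    return 6.02 * bits + 1.76

def enob(sinad_db: float) -> float:
    """Effective number of bits recovered from a measured SINAD figure."""
    return (sinad_db - 1.76) / 6.02

# A 16-bit converter ideally delivers about 98 dB of SNR...
print(f"16-bit ideal SNR: {ideal_snr_db(16):.1f} dB")   # ~98.1 dB
# ...but if on-chip noise degrades measured SINAD to 86 dB,
# the part behaves like a ~14-bit converter.
print(f"ENOB at 86 dB SINAD: {enob(86):.1f} bits")      # ~14.0 bits
```

That 12 dB gap—two effective bits lost—is exactly why single-die integration of a noisy CPU next to a 16-bit analog front end is hard, and why isolation techniques (separate supplies, guard rings, SOI substrates) matter so much in these designs.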
"The future of analog is that if you truly get and understand the signals of what you are sensing, then very focused digital signal processing can be done to extract useful information"
This seems to be what NXP is talking about with their definition of 'high performance mixed signal'. It sounds very similar to Freescale's approach to smart sensors, like the e-compass chip I mentioned above.
But like I mentioned in my earlier post, most systems do have a big chip in the middle, so if you want to offer a complete system solution to customers, you need to be able to make that big chip -- the apps processor. Something like Freescale's i.MX family, for example.
The real world is analog—not digital. Every interaction we have with it involves continuous-time signals that tell us something about the physical world we are in, such as temperature, pressure, etc. Likewise, every action we take to modify the physical environment for our purposes is a continuous (but not infinite) time action.
We don't hear, smell, feel, taste, or see 'bits'.
However, you could argue that we process a great deal of information in small chunks perhaps analogous to 'bits'. For example, light impinges upon more than 100M rods and cones in the eye, all of which are necessary to discern shapes and colors of the objects we see. For smell we use millions of sensory neurons with cilia that project into the atmosphere that turn out to have receptors tailored to specific aromatic molecules or groups of molecules.
The 'big chip in the middle' syndrome comes about from the fact that they were originally designed as general purpose computation or data organizational devices.
The future of analog is that if you truly get and understand the signals of what you are sensing, then very focused digital signal processing can be done to extract useful information and, if necessary, translate that into very useful real world physical actions or analysis. The big chip in the middle becomes highly specialized to the task at hand, much like how the body/brain processes data.
By analog standards, an ARM actually IS a "big chip in the middle." Also, couldn't NXP put its energy-metering chip into an SOIC package instead of that school bus Noonen is holding in the picture?
Great piece. And of course they have two excellent SOI-based technologies that they build many of their chips on -- EZ-HV SOI for apps in the 60 to 650V range and ABCD SOI for Smart Power. This makes them well-placed in some very high-volume markets -- like anything with a power plug, lighting, automotive -- that are finally "going green".
Nice story. Really well written! It is odd, though, as it seems NXP is challenging the status quo—and perhaps it even sounds pretentious. Forgoing the main chip seems to me to put them at a disadvantage. I'd like to be proven wrong, as I too like the company that invented I2C and the Compact Disc. I wonder why it has diverged that much?