Good that their strategy seems to work.
I'm an ex-NXP guy, and from my experience with so many other companies, I know for sure there is hardly a place where I found engineering as solid as Philips Semiconductors. What killed them was a culture where one good designer would work his arse off while a myriad of others would just eat, drink and make merry.
Look at the innovation that came from it...
I2C, the Compact Disc, the first pager, receiver architectures, the best ADCs... so much came out of that place. Whatever Samsung is today is because it used to buy complete system solutions directly from Philips. Sad that they went down so far, the sole reason being the myriad of bullshit amid some classic European engineering.
So in general NXP would like to build more ASSPs with specially designed CPUs together with the necessary analog front-end, but my question is: how high performance can it be? With high-resolution ADCs and DACs (say 16 bit or higher), I wonder how the noise from the CPU can be separated from the analog circuitry that shares the same substrate.
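To put some rough numbers on why that question matters: the step size of an ideal 16-bit converter is tiny compared with typical digital switching noise. A quick back-of-the-envelope sketch (assuming a hypothetical 3.3 V reference, not any specific NXP part):

```python
def adc_lsb(vref, bits):
    """Voltage represented by one LSB of an ideal ADC/DAC."""
    return vref / (2 ** bits)

lsb = adc_lsb(3.3, 16)
print(f"{lsb * 1e6:.1f} uV per code")  # ~50.4 uV per code
```

At roughly 50 µV per code, even a few hundred microvolts of substrate-coupled switching noise spans several LSBs, which is why mixed-signal designs rely on isolation measures such as separate supply domains, guard rings, and careful floorplanning.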
"The future of analog is that if you truly get and understand the signals of what you are sensing, then very focused digital signal processing can be done to extract useful information"
This seems to be what NXP is talking about with their definition of 'high performance mixed signal'. It sounds very similar to Freescale's approach to smart sensors, like the e-compass chip I mentioned above.
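To give a flavor of what "very focused" processing can mean on a smart sensor, the core of an e-compass heading computation is just a two-axis arctangent. This is a hypothetical toy sketch, not Freescale's actual algorithm (real parts also handle tilt compensation and hard/soft-iron calibration):

```python
import math

def compass_heading(bx, by):
    """Toy e-compass: heading in degrees clockwise from magnetic
    north, from two horizontal magnetometer readings (bx, by)."""
    # atan2 resolves all four quadrants; negating by makes the
    # heading increase clockwise (east of north) in this convention
    return math.degrees(math.atan2(-by, bx)) % 360.0

print(compass_heading(1.0, 0.0))   # field along +x -> 0.0 (north)
print(compass_heading(0.0, -1.0))  # -> 90.0 (east)
```

The point is that once you understand the signal, the "digital" part can be this small and dedicated, rather than a general-purpose big chip.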
But like I mentioned in my earlier post, most systems do have a big chip in the middle, so if you want to offer a complete system solution to customers, you need to be able to make that big chip -- the apps processor. Something like Freescale's i.MX family, for example.
The real world is analog, not digital. Every interaction we have with it involves continuous-time signals which tell us something about the physical world we are in, such as temperature, pressure, etc. Likewise, every action we take to modify the physical environment for our purposes is a continuous (but not infinite) time action.
We don't hear, smell, feel, taste, or see 'bits'.
However, you could argue that we process a great deal of information in small chunks perhaps analogous to 'bits'. For example, light impinges upon more than 100M rods and cones in the eye, all of which are necessary to discern shapes and colors of the objects we see. For smell we use millions of sensory neurons with cilia that project into the atmosphere that turn out to have receptors tailored to specific aromatic molecules or groups of molecules.
The 'big chip in the middle' syndrome comes about from the fact that they were originally designed as general purpose computation or data organizational devices.
The future of analog is that if you truly get and understand the signals of what you are sensing, then very focused digital signal processing can be done to extract useful information and, if necessary, translate that into very useful real world physical actions or analysis. The big chip in the middle becomes highly specialized to the task at hand, much like how the body/brain processes data.
By analog standards, an ARM actually IS a "big chip in the middle". Also, couldn't NXP put their energy metering chip into an SOIC package instead of that schoolbus that Noonan is holding in the picture?
Great piece. And of course they have two excellent SOI-based technologies that they build many of their chips on -- EZ-HV SOI for apps in the 60 to 650V range and ABCD SOI for Smart Power. This makes them well-placed in some very high-volume markets -- like anything with a power plug, lighting, automotive -- that are finally "going green".
Nice story, really well written! It is odd, though, as it seems NXP is challenging the status quo, and perhaps it even sounds pretentious. Forgetting about the main chip, it seems to me, puts them at a disadvantage. I'd like to be proven wrong, as I too like the company that invented I2C and the Compact Disc. I wonder why it has diverged that much.