Multicore designs are running out of gas, given the lack of parallelism in most software. Nevertheless, "there are several really interesting opportunities for new microprocessors."
Indeed, we're still waiting to see the real benefits of those cores! Multicore platforms are tricky to program, and performance is inherently limited by a single shared memory. I think a much more promising platform is many-core with distributed memory (like Adapteva or Kalray). It will still be difficult to write many-core programs, but at least the architecture is sound. A rough sketch of the difference is below.
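To illustrate the programming-model difference (my own minimal sketch, not from the original comment, using Python's multiprocessing as a stand-in for a distributed-memory many-core part): each worker process owns its slice of the data and exchanges explicit messages, instead of threads contending on one shared memory. The worker function and queue names here are just placeholders for illustration.

```python
# Sketch: distributed-memory style parallelism via message passing.
# Each process has its own address space; data moves only through queues.
from multiprocessing import Process, Queue

def worker(inbox: Queue, outbox: Queue) -> None:
    """Receive a private chunk of data, return a partial result as a message."""
    chunk = inbox.get()                        # explicit receive
    outbox.put(sum(x * x for x in chunk))      # explicit send of partial result

if __name__ == "__main__":
    inbox, outbox = Queue(), Queue()
    workers = [Process(target=worker, args=(inbox, outbox)) for _ in range(4)]
    for p in workers:
        p.start()

    data = list(range(1_000))
    step = len(data) // len(workers)
    for i in range(len(workers)):
        inbox.put(data[i * step:(i + 1) * step])   # distribute the data explicitly

    total = sum(outbox.get() for _ in workers)     # gather partial results
    for p in workers:
        p.join()
    print(total)
```

The point of the sketch: the programmer has to partition data and orchestrate communication by hand, which is harder than firing up threads over shared memory, but it scales without every core fighting over the same memory bus.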
I believe that there could be another way. If it were easier to design hardware (for example, using better languages like Cx), people could actually make their own accelerators. Then all you would need would be better FPGA architectures (using much less area and power) or a simpler, cheaper way to make ASICs. Lattice seems to be getting pretty good at low-power FPGAs, and eASIC's solution looks interesting for lower-cost ASICs. Maybe we'll get there soon? (The fact that Intel is making a hybrid Xeon-FPGA chip might be another indication.)
Old newspaper headline trick: If the headline ends in a question mark, the answer is "No."
Frankly, it's Intel's to lose. To the extent that they continue to provide the most value, Intel will remain. It's interesting to note that Intel is leading the disruptions: they took on the ARM threat in servers head-on and got a microserver chip to market two years ahead of ARM offerings. And not only have they gone down-market with Atom, which is more power-efficient and/or performant than ARM, they have recently introduced Quark for robotics and the Internet of Things. While it certainly behooves Google and Apple to invest in their own hardware, it doesn't seem like it will pencil out, given the tremendous volumes and billions in capital it takes each year to be competitive in the chip business.
Zvi Or-Bach mentioned a 1T SRAM cell being developed by one of his companies. Since most processors today are mostly made of large caches, and since this will most likely be used for processes outside Intel (I believe), it could greatly help Intel's competitors.
Also, eASIC has recently started to offer low-NRE, low-volume 28nm ASIC manufacturing. Like Matthieu says, it might be good enough to compete with Intel in some use cases.
Apple/Google/Amazon/etc. may be able to spend a bunch of $$$$$ to save a few $$ for themselves. Who else would use their custom CPU? Given the tremendous investment required to maintain the processor infrastructure (compilers, memory interfaces, I/O hubs, annual product refresh, etc.), I can't see this being anything more than an attempt to get Intel to lower their prices.
Google and others must do what they do best - and it isn't making CPUs. They may choose a different CPU company that better meets their needs, but I can't imagine there ever being a benefit of becoming a CPU company.
The biggest threat to Intel's CPU business is the likes of Janet Reno...
I would not put much stock in the opinion of someone with Transmeta credentials when it comes to microprocessors. Intel's strength is not in microprocessor architecture but in semiconductor processing, in which it is 2.5-3 generations ahead of the next guy. Google would be really dumb to try to take on Intel in microprocessors. It is VERY far away from their core competence. Charlie Sporck was one of the greats of the semiconductor industry, and he lost his job at Nat Semi because he tried and failed to take on Intel in microprocessors. The Nat Semi processor was MUCH better architecturally than the x86 architecture (and it was optimized to run Unix), but Intel easily won because Nat Semi could not compete in IC manufacturing.
I don't think Google or Apple can disrupt Intel. Google, being a predominantly web company, can try its hand at hardware, but being successful or even staying in the market would be a challenge. Hardware is a different business; it doesn't work like the web. They may scratch at it for some time and then leave. Apple also may not go very far in hardware design.