Zvi Orbach mentioned a 1T SRAM cell being developed by one of his companies. Since most processors today consist largely of cache, and since this will most likely be used on processes outside Intel's (I believe), it could greatly help Intel's competitors.
Also, eASIC recently started to offer low-NRE, low-volume 28nm ASIC manufacturing. Like Matthieu says, it might be good enough to compete with Intel in some use cases.
Old newspaper headline trick: If the headline ends in a question mark, the answer is "No."
Frankly, it's Intel's to lose. To the extent that they continue to provide the most value, Intel will remain. It's interesting to note that Intel is leading the disruptions: they took on the ARM threat in servers head-on and got a microserver chip to market two years ahead of ARM offerings. And not only have they gone down-market with Atom, which is more power-efficient and/or performant than ARM, they have recently introduced Quark for robotics and the Internet of Things. While it certainly behooves Google and Apple to invest in their own hardware, it doesn't seem like it will pencil out, given the tremendous volumes and billions in capital it takes each year to be competitive in the chip business.
Multicore designs are running out of gas, given the lack of parallelism in most software. Nevertheless, "there are several really interesting opportunities for new microprocessors."
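The "lack of parallelism" point is essentially Amdahl's law: if only a fraction p of a program can run in parallel, the speedup on n cores is bounded by 1/((1-p) + p/n). A minimal sketch (the function name and example numbers are illustrative, not from the comment):

```python
def amdahl_speedup(parallel_fraction, cores):
    """Upper bound on speedup when only part of a program parallelizes."""
    serial_fraction = 1.0 - parallel_fraction
    return 1.0 / (serial_fraction + parallel_fraction / cores)

# Even a program that is 90% parallel tops out far below the core count:
for n in (2, 8, 64):
    print(n, round(amdahl_speedup(0.9, n), 2))
```

With a 10% serial portion, 64 cores give less than a 9x speedup, which is why piling on more identical cores yields diminishing returns for typical software.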
Indeed, we're still waiting to see the real benefits of those cores! Multicore platforms are tricky to program, and performance is inherently limited by a single shared memory. I think a much more promising platform is many-core with distributed memory (like Adapteva or Kalray). It will still be difficult to write many-core programs, but at least the architecture is sound.
I believe that there could be another way. If it were easier to design hardware (for example using better languages, like Cx), people could actually make their own accelerators. Then all you would need would be better FPGA architectures (using much less area and power) or a simpler, cheaper way to make ASICs. Lattice seems to be getting pretty good at low-power FPGAs, and eASIC's solution looks interesting for lower-cost ASICs. Maybe we'll get there soon? (the fact that Intel is making a hybrid Xeon-FPGA chip might be another indication)
What are the engineering and design challenges in creating successful IoT devices? These devices are usually small, resource-constrained electronics designed to sense, collect, send, and/or interpret data. Some of the devices need to be smart enough to act upon data in real time, 24/7. Are the design challenges the same as with embedded systems, but with some developer and IT skills added in? What do engineers need to know? Rick Merritt talks with two experts about the tools and best options for designing IoT devices in 2016. Specifically, the guests will discuss sensors, security, and lessons from IoT deployments.