I think this might be possible @resistion...Every second generation is much stronger than the intermediate versions...0.35um was very strong but 0.25um was weak, then 0.18um was very strong, 90nm was so-so, 65nm was probably a higher runner than 45nm...etc
With the great success of 28nm technology, all the resources were pulled in for the production ramp-up, so offering a single process for the 20nm generation was a good choice. Moreover, design service support requires a large amount of effort, especially for LDE modeling. Therefore, it is not purely a technology decision but also a business consideration.
Solutions exist to continue the technology roadmap, but they bring increased layout restrictions and cost. Here you see the increased layout restriction manifested in a single process offering, and the increased cost manifested in widespread use of multi-patterning techniques at 14nm and below. This is exactly why 3D-IC is becoming a reality: its high cost is now more in line with the higher cost of smaller nodes. I don't think you will see the roadmap stop, as we all want increased functionality in new products. But what you might see is a slowdown, or even a reversal, in Moore's law as it has applied to the cost of consumer products. The iPad 10 may be as expensive as a small car, but you may buy it anyway.
What are the engineering and design challenges in creating successful IoT devices? These devices are usually small, resource-constrained electronics designed to sense, collect, send, and/or interpret data. Some of them need to be smart enough to act upon data in real time, 24/7. Are the design challenges the same as with embedded systems, just with a few developer and IT skills added in? What do engineers need to know? Rick Merritt talks with two experts about the tools and best options for designing IoT devices in 2016. Specifically, the guests will discuss sensors, security, and lessons from IoT deployments.