It looks like a neat technological solution for Intel. I wonder how far behind IBM and others are from working, production-worthy silicon? Kudos to Intel! I too would like to see ARM use this technology (or something similar), but more than that I am looking forward to dual-/quad-core 1 GHz to 3 GHz ARM processors... someday.
My understanding is that the fins can be paralleled for more current-switching capacity, and in the original announcement IAG said they could tune the process to reduce off-state leakage by 10x, which IMHO would be phenomenal.
If it's true that so little cost difference remains between planar and tri-gate, then tri-gate should be the must-have process at the 20 nm or 14 nm generation, and TSMC and IBM will keep trying to catch up with Intel on tri-gate, on non-SOI wafers.
It is much like the RISC vs. CISC battle of the 1980s. The result was that CISC won out in the PC and server markets, while RISC took the embedded market. The key deciding factor was software, rather than hardware. The key difference 30 years later remains software. However, this time Microsoft has leveled the playing field with Windows on ARM, not to mention the existing Linux compatibility.
What are the engineering and design challenges in creating successful IoT devices? These devices are usually small, resource-constrained electronics designed to sense, collect, send, and/or interpret data. Some of them need to be smart enough to act on data in real time, 24/7. Are the design challenges the same as with embedded systems, but with some developer and IT skills added in? What do engineers need to know? Rick Merritt talks with two experts about the tools and best options for designing IoT devices in 2016. Specifically, the guests will discuss sensors, security, and lessons from IoT deployments.