As for your conclusion that Quark is all talk and no news, I'd point out that to date it has been ONLY talk. What's significant is that you might actually be able to order a Quark chip in Q4.
Sure, you could order a Galileo board six months after the IDF '13 splash (although you can't find them anymore), but as near as I can tell there wasn't even an order code for the chip itself. If you look at intel.com, you can finally find a product brief and spec sheet -- they are dated August 2014. And the Edison SD form-factor Quark only showed up once (CES?); if you try to buy one, you are bait-and-switched to an Atom (which no longer even fits the SD form factor). The new Internet of Things Group (IoTG) at Intel has been talking about a Quark-based gateway product all year, but the only place to find it is in a video.
Here's to getting an actual product available a year later. If it finally happens, that *is* news.
The 14 nm node is interesting because it represents a shift back to logic being the process-node leader. Flash memory has held that title for several years now. However, flash's current 16 nm/15 nm generation (the "1y" node) appears to be the second-to-last transition, with 12 nm/10 nm (the "1z" node) expected to be the last planar node. Logic scaling, by contrast, still seems to have some road ahead.
What are the engineering and design challenges in creating successful IoT devices? These devices are usually small, resource-constrained electronics designed to sense, collect, send, and/or interpret data. Some of them need to be smart enough to act on that data in real time, 24/7. Are the design challenges the same as with embedded systems, just with some developer and IT skills added in? What do engineers need to know? Rick Merritt talks with two experts about the tools and best options for designing IoT devices in 2016. Specifically, the guests will discuss sensors, security, and lessons from IoT deployments.