Ah, the stuff you can do when you control your own fab process!
Intel dropped DRAM - their original product line - over 30 years ago because they were about to get clocked by the Japanese memory makers.
IBM's eDRAM really opened up single-chip architectures: eDRAM fabbed on a logic process lets the attached processor scream. Logic transistors are generally crap for analog, but if you own the process you can compensate.
TSVs? Gotta have 'em. As signal paths, they aren't as good as on-chip contacts, but they beat a package any day. Pretty good for cooling, too.
Once you have eDRAM and TSVs together, you can get more speed by implementing a honkin' wide bus with fewer buffer stages.
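To see why the wide bus wins, here's a rough back-of-envelope comparison. All the widths and clocks below are illustrative assumptions, not specs from any actual part:

```python
# Rough bandwidth comparison: a wide on-die eDRAM bus (reachable via
# TSVs) vs. a narrow off-package DRAM interface. Numbers are
# illustrative assumptions, not taken from any real device.

def bandwidth_gbps(width_bits: int, transfer_rate_mhz: float) -> float:
    """Peak bandwidth in GB/s for a bus of given width and transfer rate."""
    return width_bits * transfer_rate_mhz * 1e6 / 8 / 1e9

# Hypothetical off-package DDR-style interface: 64 bits at an
# effective 1600 MT/s.
narrow = bandwidth_gbps(64, 1600)

# Hypothetical on-die eDRAM bus: 1024 bits wide. Even at a far
# slower 500 MHz it wins on raw bandwidth, and the short on-chip
# paths need fewer buffer stages, which cuts latency too.
wide = bandwidth_gbps(1024, 500)

print(f"narrow 64-bit bus:  {narrow:.1f} GB/s")   # 12.8 GB/s
print(f"wide 1024-bit bus:  {wide:.1f} GB/s")     # 64.0 GB/s
```

The point isn't the exact numbers; it's that once the memory is on-die, bus width is nearly free, so you don't have to push the clock (and the buffer chain) nearly as hard.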
Intel is REALLY good at watching research, then incorporating results into its new devices. They (ahem) borrowed heavily from a project I worked on to create the super-wide high-speed networks in Ivy Bridge and Haswell.
The other thing to remember about Intel is that they almost certainly have stuff working in the lab RIGHT NOW that is 4-5 years ahead of the commercial state of the art, and are already figuring out how to manufacture for the 2020 market. They are really great at maturing technologies before releasing them, letting their investments in present-day technologies pay off first.
I can hardly wait to see what path they'll follow to bring carbon into the market. That, I gar-on-tee, will change everything. I don't know if they'll be the first, but they'll be close.