Many of these new 'IoT' applications will require low cost, low power, a small footprint, etc. Many designers may find that jumping to the latest silicon node (20nm, 14nm, etc.) carries exorbitant costs, and realize that 2.5D packaging, which allows mixing of various silicon technologies, can be more cost effective, reduce risk, and shorten development time.
Part of the cost will be the width of memory needed to support the processor/controller used. Moving from 64 bit down to 8 bit (as long as 8 bit can do the job) has a huge cost impact. In addition, purchasing an 8 bit KGD and using it in a 2.5D package, rather than taping out a complex design at TBD nm, can save even more money. Lower development dollars reduce the number of units that must be sold to break even, thereby improving ROI.
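To make the break-even point concrete, here is a minimal sketch of the arithmetic: break-even units = NRE / (selling price - unit cost). All of the dollar figures below (NRE, selling price, per-unit cost) are made-up placeholders for illustration, not real foundry or packaging quotes.

```c
#include <stdio.h>

/* Hypothetical break-even comparison: monolithic advanced-node tapeout vs.
 * a 2.5D package reusing an 8-bit KGD. Figures are illustrative only. */

static double breakeven_units(double nre, double asp, double unit_cost)
{
    /* Units that must be sold so per-unit margin recovers the up-front NRE. */
    return nre / (asp - unit_cost);
}

int main(void)
{
    double asp = 4.00;            /* assumed average selling price per unit, $ */

    double nre_mono = 8.0e6;      /* assumed NRE for a monolithic advanced-node SoC */
    double cost_mono = 1.10;      /* assumed per-unit cost, monolithic die */

    double nre_25d  = 2.5e6;      /* assumed NRE for 2.5D integration of an 8-bit KGD */
    double cost_25d = 1.60;       /* assumed per-unit cost incl. interposer/assembly */

    printf("Monolithic break-even: %.0f units\n",
           breakeven_units(nre_mono, asp, cost_mono));
    printf("2.5D/KGD break-even:   %.0f units\n",
           breakeven_units(nre_25d, asp, cost_25d));
    return 0;
}
```

With these assumed numbers, the 2.5D/KGD route breaks even at roughly a third of the volume, even though its per-unit cost is higher; the point is only that lower NRE shifts the break-even volume down.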
The architect will need to weigh various tradeoffs between a single chip (at a larger silicon node) and a 2.5D approach. The cost, performance, and power requirements might be achievable with a single IC in a 65nm or 130nm process. But without examining the other alternatives available, long-held assumptions can lead to the wrong solution in terms of cost, risk, etc. As an example, a working PHY that has been created and silicon-certified in 90nm can be much lower risk than trying to integrate that PHY into a new IC. Using the 90nm PHY as it was certified is already proven; integrating it into a new IC will require a new certification with USB-IF, or the equivalent industry bodies for SATA, PCI Express, and other interfaces.
I can see the huge benefit of using an 8-bit MCU for IoT nodes on the hardware side. I'm not so sure about the software side. I don't believe you can run Linux on an 8-bit MCU, and that makes life a bit more difficult, since I assume Linux is the preferable OS for embedded applications. Nonetheless, for IoT nodes, given the power and size constraints, I don't think running Linux is preferred anyway.
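For what it's worth, software on an 8-bit node without Linux usually ends up as a bare-metal "superloop" (or a tiny RTOS). Below is a minimal sketch of that pattern; sensor_read, radio_send, and sleep_until_wakeup are hypothetical stand-ins for real drivers, stubbed out here so the sketch compiles and runs on a host.

```c
#include <stdint.h>
#include <stdio.h>

/* Hypothetical hardware hooks -- on a real 8-bit MCU these would be
 * register-level drivers or vendor HAL calls; stubs here for illustration. */
static uint16_t sensor_read(void)        { return 123; /* fake reading */ }
static void     radio_send(uint16_t v)   { printf("tx: %u\n", v); }
static void     sleep_until_wakeup(void) { /* enter low-power mode, wake on timer */ }

int main(void)
{
    /* Bare-metal superloop: no scheduler, no Linux, just
     * sample -> transmit -> sleep, repeated forever. */
    for (uint8_t i = 0; i < 5; i++) {   /* bounded here so the host demo terminates */
        uint16_t sample = sensor_read();
        radio_send(sample);
        sleep_until_wakeup();
    }
    return 0;
}
```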