We're happy that the technology keeps developing, but each new generation also introduces yet another memory slot in our devices and yet another type of memory module. I already have plenty of memory modules, yet every time I buy a new computer I have to purchase new ones. I think the modularity is of little use. Memory should be soldered onto the motherboard, with the option to add capacity via expansion modules; that would save us from hoarding old memory modules for a future use that never comes.
When memory technology improves, the next-generation CPUs are designed to interface with it, and memory manufacturers retool their factories for the new parts. It is more cost-effective for both the factories and the CPU designers to support only one technology at a time.
DDR4 has advantages at the high end in speed and at the low end in power consumption, so I predict we'll be seeing a lot of it next year, and less and less of DDR3 as the factories wind down its production and DDR3 ends up becoming more expensive than DDR4.
Modularity helps when you want to change the density of your memory. This will be even more true with DDR4, whose spec includes multi-die devices, so DIMMs of increasing density will become available over time.
You may be right for other products, but in the case of old memory modules I don't think anyone will want them, since they'll be useless in new systems; there simply won't be any buyers. If they were reusable, that would be good, since it would keep them out of the e-waste stream.
What are the engineering and design challenges in creating successful IoT devices? These devices are usually small, resource-constrained electronics designed to sense, collect, send, and/or interpret data. Some of them need to be smart enough to act on that data in real time, 24/7. Specifically, the guests will discuss sensors, security, and lessons learned from IoT deployments.