Conceptually, it's a great idea, but remember: there ain't no such thing as a free lunch. There needs to be a strong economic argument for universal memories, which, as resistion said, are essentially a tradeoff.
"Universal" is a distracting buzzword; "tradeoff" is more appropriate. Eventually the interconnect density vs. speed vs. power tradeoff dominates memory performance, just as it does for logic. For nonvolatile memory, you also need to consider the inherent speed-retention tradeoff.
I find it silly to refer to "one universal memory" being on the horizon. Bubbles were cool because you could actually set up optics to watch the bubbles move (perhaps a testimony to their speed?).
Anyone remember Bubble Memory? It was going to replace both DRAM and the hard disk. Then there was some talk of EEPROM doing the same.
The problem is that the primary requirements for each stand on either side of a wide divide - cheap enough for mass storage vs. fast enough for processing. By trying to do both with one part, you end up with something that does neither well.
I suppose it's logical to assume that someday, we will have a non-volatile memory technology that is both cheap enough to store massive amounts of video and fast enough to hook to a CPU, but I'm not holding my breath.