I'm not sure how much memory these processors can support, but it is certainly enough for a major push in the big data space. It's an impressive accomplishment from Intel, and one that ups the ante significantly in this market.
The multiple terabytes of memory on these new systems are an extraordinary leap from the largest configurations I'd heard of before (between 8 and 64 gigabytes). I'd thought processor address space was the limiting factor. How much addressable memory can these new processors actually support? This will certainly enable big data operations that were not possible before.
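To put rough numbers on the address-space question: a quick sketch, assuming the widths typical of x86-64 parts (48-bit virtual addresses, with the architecture permitting up to 52-bit physical addresses; the exact physical width varies by processor model, so treat these figures as assumptions, not a claim about the specific chips discussed in the article):

```python
# Back-of-the-envelope address-space sizes for assumed x86-64 address widths.
# 48-bit virtual and 52-bit physical are architectural figures; 46-bit physical
# is an assumed example of a per-model implementation limit.
def addressable_tib(bits: int) -> float:
    """Size of a 2**bits-byte address space, in tebibytes."""
    return 2 ** bits / 2 ** 40  # bytes -> TiB

print(addressable_tib(46))  # assumed 46-bit physical limit: 64.0 TiB
print(addressable_tib(48))  # 48-bit virtual addressing:   256.0 TiB
print(addressable_tib(52))  # 52-bit architectural ceiling: 4096.0 TiB (4 PiB)
```

So even a modest 46-bit physical implementation can address tens of tebibytes, well beyond the multi-terabyte systems described here; the binding constraint in practice tends to be DIMM density and memory-controller design rather than the address space itself.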
Sometime in the mid-'90s, in the early Pentium Pro days, I went to a talk titled something to the effect of "X86 vs RISC" by Intel Fellow Fred Pollack. He made some interesting observations: one, that it was not purely a technology issue but primarily an economic one. On the technical side, he pointed out that even then RISC architectures were getting more complex, while x86 was shedding some of its ISA legacy.
He predicted most of x86's architectural rivals would eventually fall, primarily because they would not be able to invest sufficiently in their architectures.
Yes, it seems as though Intel is going on the offensive rather than waiting to be caught in a reactive stance as ARM comes after its dominant market share in servers. This is a real threat to IBM's PowerPC and Oracle's SPARC as well, but those companies could make moves of their own, since Intel seems to be going after performance while ARM goes after power savings.
What are the engineering and design challenges in creating successful IoT devices? These devices are usually small, resource-constrained electronics designed to sense, collect, send, and/or interpret data. Some of them need to be smart enough to act on data in real time, 24/7. Specifically, the guests will discuss sensors, security, and lessons learned from IoT deployments.