While this will revolutionise the "Big Data" space, I don't think the idea is that revolutionary — more an idea whose time has come.
We used to think that the 68000 CPU was such an amazing thing compared to the 6800, but when the 6800 was developed they simply couldn't fit many more transistors onto a die than a 6800 required. IBM was already making large clunkers with bucketloads of discrete logic chips, and I remember that around the time the 8080 came out, NEC shipped a 16-bit dual-processor machine with megabytes of memory — a design that simply wouldn't fit onto a chip.
All I'm saying is most of these things come down to "this is the amount of resources we have now; what can we do with it?"
I mean we've known for decades where the bottlenecks are, we just couldn't do anything about it.
Don't get me wrong — kudos to NVidia for taking the first brave step, but it's not unknown technology. We always cross the bridge when we come to it.
What are the engineering and design challenges in creating successful IoT devices? These devices are usually small, resource-constrained electronics designed to sense, collect, send, and/or interpret data. Some of them need to be smart enough to act on data in real time, 24/7. Are the design challenges the same as with embedded systems, just with a few developer and IT skills added in? What do engineers need to know? Rick Merritt talks with two experts about the tools and best options for designing IoT devices in 2016. Specifically, the guests discuss sensors, security, and lessons from IoT deployments.