While this will revolutionise the "Big Data" space, I don't think the idea is that revolutionary; it's more an idea whose time has come.
We used to think the 68000 CPU was amazing compared to the 6800, but when the 6800 was developed they simply couldn't fit many more transistors onto a die than a 6800 needed. IBM was already building large machines out of bucketloads of discrete logic chips, and I remember that around the time the 8080 came out, NEC shipped a 16-bit dual-processor system with megabytes of memory, a design that simply wouldn't have fit into a single chip.
All I'm saying is that most of these things boil down to "this is the amount of resources we have now; what can we do with it?"
I mean we've known for decades where the bottlenecks are, we just couldn't do anything about it.
Don't get me wrong, kudos to NVidia for taking the first brave step, but it's not unknown technology. We always cross the bridge when we come to it.
Replay available now: A handful of emerging network technologies are competing to be the preferred wide-area connection for the Internet of Things. All claim lower costs and power use than cellular, but none have wide deployment yet. Listen in as proponents of the leading contenders make their case to be the metro or national IoT network of the future. Rick Merritt, EE Times Silicon Valley Bureau Chief, moderates this discussion. Join in and ask his guests questions.