WASHINGTON – “Big data” applications like analytics software are expected to reshape the way U.S. power utilities operate and could transform the way the smart grid is rolled out, a market study concludes.
Brooklyn-based GTM Research forecasts that utilities will spend $20 billion over the next nine years on data analytics used to track grid operations and consumer power usage. That works out to an estimated $100 per household over the same period.
Utility spending on data analytics, expected to total $1.1 billion next year, is forecast to grow to $3.8 billion by 2020.
“With the influx of big data, the potential of smart grid has shifted dramatically from the original aim of adding a myriad of new devices toward a complete re-invention of the way utilities do business,” said GTM’s Rick Thompson.
Data analytics software allows utilities to track and predict grid operations and power consumption. The capabilities have come to be called the “soft grid.”
Among the big data technologies driving the soft grid are open-source platforms like Hadoop and big data “appliances” enabled by massively parallel processing capabilities.
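To illustrate the kind of workload involved, here is a minimal sketch of the map-and-reduce pattern that Hadoop popularized, applied to hypothetical smart-meter readings; the household IDs and consumption values are invented for the example, and a production job would of course run distributed across a cluster rather than in a single process.

```python
from collections import defaultdict

# Hypothetical smart-meter readings: (household_id, kWh) pairs,
# standing in for the records a utility would collect at scale.
readings = [
    ("hh-001", 1.2), ("hh-002", 0.8), ("hh-001", 1.5),
    ("hh-003", 2.1), ("hh-002", 0.9),
]

# Map step: emit (key, value) pairs keyed by household.
mapped = [(hid, kwh) for hid, kwh in readings]

# Shuffle/reduce step: group by key and sum consumption,
# mirroring what a Hadoop MapReduce job does across many nodes.
totals = defaultdict(float)
for hid, kwh in mapped:
    totals[hid] += kwh

print(dict(totals))
```

The same grouping-and-aggregation shape underlies most grid analytics, whether the aggregate is per-household usage, per-feeder load, or per-region demand.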
Among the 60 vendors GTM examined, key players in the emerging soft grid market include network infrastructure and virtualization specialists such as Cisco Systems and VMware.