By Mark Thorsen
No matter where you look, the amount of information worldwide is exploding, and renewable energy is no exception. As the use and deployment of renewables grows, so too does the volume of data generated by the technologies surrounding these energy sources.
Everything from solar panels to wind turbines is creating vast amounts of new data that must be collected, extracted, warehoused and analyzed to make it available in the right way.
These functions are producing an enormous amount of information, which is flooding into utilities at a high rate and must be analyzed and acted on. At the same time, utilities are retaining more data than in the past, making retention, and its cost, critical issues going forward.
Consider the smart meter. A single device typically reports data at 15-minute intervals around the clock, generating 400MB of data a year. Bloomberg New Energy Finance predicts 680 million smart meters will be installed globally by 2017 – leading to roughly 280 petabytes of data a year.
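A quick back-of-the-envelope sketch shows how those two figures combine. The inputs (400MB per meter per year, 680 million meters) come straight from the projection above; everything else is simple multiplication:

```python
# Sanity check of the global smart meter data projection.
# Input figures are from the article: 400 MB per meter per year,
# 680 million meters installed worldwide by 2017.

MB_PER_METER_PER_YEAR = 400
METERS_WORLDWIDE = 680_000_000

total_mb = MB_PER_METER_PER_YEAR * METERS_WORLDWIDE
total_pb = total_mb / 1_000_000_000  # 1 PB = 1,000,000,000 MB (decimal units)

print(f"{total_pb:.0f} PB per year")  # prints "272 PB per year"
```

That works out to about 272 petabytes a year, in line with the roughly 280 petabytes cited.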
In addition to the sheer volume of data being generated, utilities are faced with the challenge of shifting energy from renewable sources onto existing grids, which are largely outdated and inefficient. By employing smart meters, utility companies are able to gather Big Data insights to find trends among the “noise” of information.
However, at the end of the day, most utilities are still left with the issue of information overload. Take Austin Energy, as an example. The electric utility covers a population of around 1 million and relies on such energy sources as nuclear, coal, natural gas and renewables to generate almost 3,000 MW of power.
Before 2003, the utility processed about 20 terabytes of data per year. By 2010, after deploying 500,000 devices (smart thermostats and meters, sensors, computers, servers and network gear), Austin Energy was gathering 100 terabytes of data a year – five times more. Its smart meters deliver consumption data every 15 minutes, and that data is stored for three years.
In 2012, Smart Grid 2.0 was launched, along with the concept and practice of real-time data. The program focuses on the grid beyond the meter, with integration back to the utility grid, spotlighting the management of distributed generation and storage (solar rooftops, micro wind). With readings carried out every 5 minutes, the data amounts to 144 million reads per day, reaching a capacity of 400 terabytes. With real-time data in place and transmission data management requirements, Austin Energy is managing and storing 1.2 petabytes a year – a 60-fold leap from the pre-smart-grid era.
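The read-volume figure above follows directly from the device count and the reporting interval. A minimal sketch, using only the numbers given in the Austin Energy example (500,000 devices, readings every 5 minutes):

```python
# Rough arithmetic behind Austin Energy's daily read volume.
# Figures from the article: 500,000 deployed devices, a reading every 5 minutes.

DEVICES = 500_000
MINUTES_PER_DAY = 24 * 60
READS_PER_DEVICE_PER_DAY = MINUTES_PER_DAY // 5  # 288 reads per device per day

daily_reads = DEVICES * READS_PER_DEVICE_PER_DAY
print(f"{daily_reads:,} reads per day")  # prints "144,000,000 reads per day"
```

Each device reporting every 5 minutes produces 288 reads a day, and half a million devices bring the total to the 144 million daily reads cited.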
Enter the era of Meter Data Management (MDM) systems, which organize and store the vast amounts of data these smart meters generate. Navigant Research indicates that by 2018, 98% of all meters in North America will be covered by an MDM system, and that global smart grid technology revenue will grow from $44.1 billion in 2014 to $70.2 billion in 2023.
As renewables continue to become an integral part of the energy mix, MDM’s ability to manage Big Green Data will become increasingly important to understanding, exploiting and ultimately optimizing their usage.
Mark Thorsen is CEO of GreenMatch, a site that provides green energy and technology supplier comparisons and pricing.
For more on this topic, check out the IBM White Paper, Managing Big Data for Smart Grids and Smart Meters.
And remember to follow IBM Smarter Planet on Google+