Instrumented. Interconnected. Intelligent.
October 15, 2014
 

By Mark Thorsen, CEO, GreenMatch

No matter where you look, the amount of information worldwide is exploding, and renewable energy is not immune. As the use and deployment of renewables grows, so too does the amount of data generated by the technologies surrounding these energy sources.

Everything from solar panels to wind turbines is creating vast amounts of new data that requires collection, extraction, warehousing, analysis and statistical processing, all to make it available in the right way.

These functions generate an enormous amount of information, which is flooding into utilities at a high rate and must be analyzed and acted upon. At the same time, utilities are retaining more data than in the past, making retention, and its costs, critical issues going forward.

Consider the smart meter. A typical device reports data at 15-minute intervals, 24/7, generating about 400 MB of data a year. Bloomberg New Energy Finance predicts that 680 million smart meters will be installed globally by 2017, which works out to roughly 280 petabytes of data a year.
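The back-of-the-envelope math behind those figures is easy to check. A quick sketch, using the per-meter volume and meter count cited above (the unit conversion is decimal, an assumption on my part):

```python
# One reading every 15 minutes, around the clock
reads_per_day = 24 * 60 // 15            # 96 reads per meter per day

# Figures cited in the article
mb_per_meter_year = 400                  # ~400 MB per meter per year
meters = 680_000_000                     # Bloomberg NEF projection for 2017

# Total fleet volume, converting MB to PB (1 PB = 10^9 MB, decimal units)
total_pb = meters * mb_per_meter_year / 1_000_000_000
print(f"{total_pb:.0f} PB/year")         # 272 PB/year, in line with the ~280 PB cited
```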

In addition to the sheer volume of data being generated, utilities face the challenge of shifting energy from renewable sources onto existing grids, which are largely outdated and inefficient. By employing smart meters, utility companies can gather Big Data insights, finding trends amid the “noise” of information.

However, at the end of the day, most utilities are still left with the problem of information overload. Take Austin Energy as an example. The electric utility covers a population of around 1 million and relies on energy sources such as nuclear, coal, natural gas and renewables to generate almost 3,000 MW of power.

Before 2003, the utility processed about 20 terabytes of data per year. By 2010, after deploying 500,000 devices (smart thermostats and meters, sensors, computers, servers and network gear), Austin Energy was gathering 100 terabytes a year, five times more. Its smart meters deliver consumption data every 15 minutes and store it for three years.

In 2012, Smart Grid 2.0 was launched, together with the concept and practice of real-time data. The program focuses on the grid beyond the meter, with integration back to the utility grid, and spotlights managing distributed generation and storage (solar rooftops, micro wind). With readings carried out every 5 minutes, the data amounts to 144 million reads per day, reaching a capacity of 400 terabytes. With real-time data in place, plus transmission data management requirements, Austin Energy is managing and storing 1.2 petabytes a year, a 60-fold leap from the pre-smart-grid era.
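Austin Energy's read count follows directly from its device fleet and the read interval; a quick check, using the 500,000-device figure and 5-minute cadence from the article:

```python
devices = 500_000                         # smart thermostats, meters, sensors, etc.
reads_per_device = 24 * 60 // 5           # 288 reads per device per day at 5-minute intervals

total_reads_per_day = devices * reads_per_device
print(f"{total_reads_per_day:,} reads/day")   # 144,000,000 -> matches the article
```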

Enter the era of Meter Data Management (MDM) systems, which organize and store the vast amounts of data these smart meters generate. Navigant Research indicates that by 2018, 98 percent of all meters in North America will be covered by an MDM system, and that global smart grid technology revenue will grow from $44.1 billion in 2014 to $70.2 billion in 2023.
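Implementations differ widely from vendor to vendor, but the core MDM task, collecting interval reads and rolling them up into billing-ready aggregates, can be sketched in miniature. The record layout, meter IDs and values below are hypothetical, purely for illustration:

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical interval reads: (meter_id, timestamp, kWh consumed in that 15-minute slot)
reads = [
    ("meter-001", datetime(2014, 10, 15, 0, 0), 0.12),
    ("meter-001", datetime(2014, 10, 15, 0, 15), 0.10),
    ("meter-001", datetime(2014, 10, 16, 0, 0), 0.15),
    ("meter-002", datetime(2014, 10, 15, 0, 0), 0.30),
]

# Roll interval data up to daily consumption per meter
daily = defaultdict(float)
for meter_id, ts, kwh in reads:
    daily[(meter_id, ts.date())] += kwh

for (meter_id, day), kwh in sorted(daily.items()):
    print(meter_id, day, round(kwh, 2))
```

Real systems add validation, estimation and editing (VEE) rules on top of this aggregation step, but the shape of the problem, keyed time-series rollups at utility scale, is the same.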

As renewables continue to become an integral part of the energy mix, MDM’s ability to manage Big Green Data will become increasingly important to understanding, exploiting and ultimately optimizing their usage.
_______________________________

Mark Thorsen is CEO of GreenMatch, a site that provides green energy and technology supplier comparisons and pricing.
_______________________________

For more on this topic, check out the IBM White Paper, Managing Big Data for Smart Grids and Smart Meters.
____________________________

And remember to follow IBM Smarter Planet on Google+
