
Thomas Ludwig, Director, German Climate Computing Center; Professor of Computer Science, University of Hamburg

By Thomas Ludwig

It’s no exaggeration to say that climate change is one of the major challenges facing mankind today. While the causes, extent and long-term impact are the subject of ongoing discussion and conjecture, the overall phenomenon is real and must be addressed.

Answering that challenge requires two things: determining how to mitigate the effects of climate change caused by human activity, and learning how to adapt to our changing environment. At the German Climate Computing Center, we are dedicated to both goals. We provide leading environmental researchers with the supercomputing capabilities to continuously run comprehensive climate simulations with coupled Earth system models, and to store and analyze the massive amounts of data those simulations generate.

By maintaining the largest archive of climate data in the world, we enable the sustainable use of this precious data trove for study and scientific inquiry.

Partnering with organizations such as the Intergovernmental Panel on Climate Change, the Max-Planck Institute for Meteorology and many others, we’re helping pursue a variety of analyses around climate and the environment, including:

·     Researching and reporting on the increase in the earth’s temperature and its impact on climate;
·     Simulating how and where oil spills might impact the environment;
·     Investigating how changes in airplane traffic routes and altitudes might reduce the emission of CO2 and other gases that may impact climate;
·     Producing advanced 3D models and visualizations for how we might develop smarter cities and more energy-efficient buildings;
·     Creating high-definition models of how clouds and precipitation form for more accurate predictions of weather and climate.

Of course, doing that kind of research requires gathering and generating data, and lots of it. Currently, our archive of climate data is roughly 40 petabytes. To put that in terms of DVDs: the average person might own between half a dozen and 30, and a real enthusiast might have 100. By comparison, a single petabyte of data is equivalent to roughly 210,000 DVDs.
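As a quick sanity check on that comparison, here is a minimal back-of-the-envelope calculation. It assumes a standard single-layer DVD capacity of about 4.7 GB and decimal units (1 PB = 1,000,000 GB); those assumptions are mine, not stated in the article.

```python
# Back-of-the-envelope check of the petabyte-to-DVD comparison.
# Assumptions (not from the article): 4.7 GB per single-layer DVD,
# decimal units where 1 petabyte = 1,000,000 gigabytes.
DVD_GB = 4.7
PETABYTE_GB = 1_000_000

dvds_per_petabyte = PETABYTE_GB / DVD_GB
archive_pb = 40  # current archive size cited in the article

print(f"DVDs per petabyte: {dvds_per_petabyte:,.0f}")                        # ~212,766
print(f"DVDs for the 40 PB archive: {archive_pb * dvds_per_petabyte:,.0f}")  # ~8.5 million
```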

And as big as 40 petabytes is, the rate and pace of our climate data growth is staggering. We estimate that our archive will grow by about 75 petabytes every year for the next five years. Think of our supercomputing systems as a race car and the data that must be stored and analyzed as its fuel: for the car to perform, you have to be able to deliver that fuel very quickly. Otherwise you stall at the starting line.

To get that fuel to our engine, we looked to IBM for innovative technology and services, selecting the High Performance Storage System (HPSS), developed by IBM and the U.S. Department of Energy, as the basis of our Hierarchical Storage Management (HSM) system. HPSS is the result of a 20-year collaboration between IBM and the DOE, and it is ideal for providing fast access to large scientific data archives like ours.
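For readers unfamiliar with hierarchical storage management, the sketch below illustrates the general idea: recently used data stays on fast disk, colder data migrates to a cheaper archive tier such as tape, and a read transparently recalls it. This is a hypothetical illustration of the concept only; it does not use HPSS's actual interfaces, and all names and thresholds are invented for the example.

```python
import time
from dataclasses import dataclass, field

# Minimal, hypothetical sketch of a hierarchical storage management (HSM) policy.
# This is NOT the HPSS API; tier names and thresholds are invented for illustration.

@dataclass
class FileRecord:
    name: str
    size_gb: float
    last_access: float   # Unix timestamp of the last read
    tier: str = "disk"   # "disk" (fast cache) or "tape" (archive)

@dataclass
class HsmCache:
    capacity_gb: float
    files: list = field(default_factory=list)

    def used_gb(self) -> float:
        """Space currently occupied on the fast disk tier."""
        return sum(f.size_gb for f in self.files if f.tier == "disk")

    def migrate_cold_files(self, max_idle_seconds: float) -> None:
        """Move files that have not been read recently from disk to tape."""
        now = time.time()
        for f in self.files:
            if f.tier == "disk" and now - f.last_access > max_idle_seconds:
                f.tier = "tape"

    def read(self, name: str) -> FileRecord:
        """Reading a file recalls it to disk if it was staged out to tape."""
        for f in self.files:
            if f.name == name:
                f.tier = "disk"
                f.last_access = time.time()
                return f
        raise FileNotFoundError(name)
```

In a production archive these policies run continuously across many tiers and many tape libraries; the point here is only the migrate-and-recall pattern that makes a multi-hundred-petabyte archive affordable while keeping hot data fast to reach.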

Currently we can access our data at a speedy 12 gigabytes per second, and with upgrades later this year we’ll be able to increase that by 50 percent. And as our data grows, our HSM system will be able to handle a capacity of more than 500 petabytes.
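A quick back-of-the-envelope check shows how these figures fit together. The inputs below are the numbers cited in the article; everything derived from them is simple arithmetic.

```python
# Rough projection of archive size versus planned HSM capacity,
# using the figures cited in the article.
current_pb = 40          # archive size today
growth_pb_per_year = 75  # estimated annual growth
years = 5
capacity_pb = 500        # planned HSM capacity ("more than 500 petabytes")

projected_pb = current_pb + growth_pb_per_year * years
print(f"Projected archive after {years} years: ~{projected_pb} PB")   # ~415 PB
print(f"Fits within planned capacity: {projected_pb < capacity_pb}")  # True

# Access bandwidth: 12 GB/s today, planned to increase by 50 percent.
current_gbps = 12
upgraded_gbps = current_gbps * 1.5
print(f"Upgraded access rate: ~{upgraded_gbps:.0f} GB/s")              # ~18 GB/s
```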

Understanding the challenges posed by climate change and other threats to our environment, and finding solutions to address them, will require continued scientific inquiry and discovery. That effort will be driven by big data and underpinned by innovation, invention and technology such as HPSS and our HSM system, built through our partnership with IBM.

