By Harry Kolar
New York’s Lake George is a pristine, 32-mile-long lake in the Adirondack Mountains that is noted for its water quality and clarity. While the lake is very clean, it faces multiple anthropogenic threats, including road salt incursion and several invasive species.
The Jefferson Project at Lake George, a joint research collaboration involving Rensselaer Polytechnic Institute, IBM Research, and the FUND for Lake George, is focused on protecting the lake and helping address the world’s looming freshwater supply challenges.
The project involves more than 60 scientists around the world (four IBM Research labs are involved), including biologists, computer scientists, physicists, engineers and chemists. Working as a virtual team, we’re pushing the boundaries in Internet-of-Things sensors, data analytics, and modeling of complex natural systems.
The Jefferson Project, named after Thomas Jefferson, who admired the lake, was launched 1 ½ years ago. But now that we have dozens of sensors deployed and powerful computer systems set up, we are beginning to analyze rich streams of data and to use that data to refine our multiple computer models of the workings of the lake and its watershed.
I have spent the last 10 years studying water, including major projects on New York’s Hudson River and in Ireland’s Galway Bay, but at Lake George we are making advances at a pace I have not seen before.
For starters, we created a state-of-the-art observational system–including sensors deployed on the lake bottom, on floating platforms and in feeder streams. Some of the sensors are “smart.” For instance, our floating platform can detect when the weather is changing and adjust the cadence of its monitoring activities so that we can better capture certain events.
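The idea behind a weather-aware sampling cadence can be sketched roughly as follows. The thresholds, intervals, and heuristics here are invented for illustration; they are not the project’s actual values or logic.

```python
# Illustrative sketch of an adaptive sampling cadence: when incoming
# weather readings suggest a storm, the platform samples more often.
# All thresholds and intervals are hypothetical.

BASELINE_INTERVAL_MIN = 30   # routine monitoring: one sample every 30 min
STORM_INTERVAL_MIN = 2       # during a weather event: every 2 min

def choose_interval(pressure_drop_hpa_per_hr, wind_speed_m_s):
    """Pick a sampling interval (minutes) from simple weather heuristics."""
    storm_likely = pressure_drop_hpa_per_hr > 2.0 or wind_speed_m_s > 10.0
    return STORM_INTERVAL_MIN if storm_likely else BASELINE_INTERVAL_MIN

print(choose_interval(0.5, 3.0))   # calm conditions -> baseline cadence
print(choose_interval(3.2, 12.0))  # approaching storm -> rapid cadence
```

The point is simply that the decision runs on the platform itself, so the monitoring rate tightens before the event arrives rather than after analysts notice it.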
The computer modeling system is cutting edge, as well. Typically, scientists create discrete models for different elements of an ecosystem. In this case, we are coupling them–the way nature works. Our weather model feeds into the model that predicts run-off from storms, which feeds into the salt model, which feeds into the lake circulation model, which feeds into the food-chain model. Using data from sensors, we will be able to validate and continuously refine our models over time as we add new sensors and measurement data to the information platform.
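One way to picture the coupling is as a chain of functions, each consuming the previous model’s output. The model names follow the article, but the interfaces, coefficients, and threshold below are stand-ins invented for illustration; the real models are far more complex.

```python
# Sketch of a coupled model chain: weather -> run-off -> salt ->
# circulation -> food chain. Each stage is a crude stub that transforms
# the previous stage's output. All numbers are hypothetical.

def weather_model(forecast):
    # predicted rainfall over the watershed, in mm
    return {"rain_mm": forecast["rain_mm"]}

def runoff_model(weather):
    # stub: some volume of stream run-off per mm of rainfall
    return {"runoff_m3": weather["rain_mm"] * 1_000}

def salt_model(runoff):
    # stub: run-off carries dissolved road salt into feeder streams
    return {"chloride_kg": runoff["runoff_m3"] * 0.02}

def circulation_model(salt):
    # stub: circulation dilutes the chloride load through the lake
    return {"chloride_mg_per_l": salt["chloride_kg"] / 5_000}

def food_chain_model(circulation):
    # stub: flag stress on the food chain above a chloride threshold
    return {"stress": circulation["chloride_mg_per_l"] > 0.5}

def run_coupled(forecast):
    """Run the chain end to end, mirroring the coupling described above."""
    return food_chain_model(
        circulation_model(salt_model(runoff_model(weather_model(forecast)))))

print(run_coupled({"rain_mm": 40}))
```

The design point is that validation can happen stage by stage: sensor data on any link in the chain (stream flow, chloride concentration, circulation) can correct that model without rebuilding the others.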
The end goal is to be able to run simulations to help predict how events such as heavy storms, road salt run-off, and introduction of new plant or animal species would likely affect the entire natural environment. It’s a holistic approach to ecosystem analysis and management.
What if the state or local department of transportation uses more or less salt on a road within the Lake George watershed, or changes its chemical recipe? What if a storm washes out a new area of land and deposits sediment into the lake? Armed with more and better knowledge from the data and analyses, public officials, business leaders and citizens will be able to make better decisions that could reduce harm to the lake.
It’s an exciting time for our research team. In the coming months, we’ll gather huge amounts of data, deploy new sensors and perform experiments to help us understand how the lake is changing. (Already, we know from past monitoring efforts that chloride inputs from road salt have tripled, algae have increased 33% and five invasive species have been introduced over the past 35 years.)
Eventually, we’ll commercialize the technology–taking on problems in other lakes, rivers, estuaries, coastal areas, ports, and the oceans. Today, large swaths of the globe are facing water crises that already threaten the health of individuals and the economies of nations. Fresh water challenges will continue to spread and disrupt more lives. It may turn out that the research we’re doing now to keep Lake George pristine proves critical to the sustainability of the planet and the survival of the people, and the countless animals and plants, living on it. This is the kind of research a scientist wants to work on.