Editor’s note: The following is a guest post by Cameron Brooks, director, Smarter Water Management, IBM Big Green Innovations.
Science has made some pretty impressive advancements with water in recent years. We can, for example, harvest fog as a water source, which is fairly effective in foggy havens like the coast of Chile. We also have the technology to turn certain groundwater contaminants (e.g., the nitrates used in fertilizer) into fuel. But for all the progress we've made, our water management systems are sadly lacking. Many municipalities across the U.S. lose up to 30% of their water through leaks, and the majority of water agencies have no way to detect leaks preventively; residents simply wait for disaster to strike before local governments repair or upgrade their utilities.
It doesn't have to be that way, though. There's no reason we should settle for this water loss rate given the technologies that are available today. With smart water grids, in which a sophisticated monitoring system composed of sensors, meters and data analytics is installed across the network, we can use real-time data to dramatically reduce water loss rates. We can identify leaks in minutes, if not seconds, and make informed decisions about how to allocate resources. A solid IT system would require an up-front financial investment, for sure, but the alternative is to wait for a crisis before paying obscene sums of money to repair water lines piece by piece.
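To make that concrete, here is a minimal sketch of the kind of analytics a smart water grid enables. It assumes hourly flow readings from a hypothetical district meter and flags sudden, sustained jumps above the recent baseline, one common leak signature, especially overnight when legitimate demand is low. This is an illustration of the general idea, not the actual analytics any particular vendor ships; the function and data names are invented for the example.

```python
from statistics import mean, stdev

def detect_leaks(readings, window=6, z_threshold=3.0):
    """Flag readings whose flow jumps well above the recent baseline.

    readings: list of (timestamp, flow_gpm) tuples from a district meter.
    window:   number of prior readings used to estimate the baseline.
    """
    alerts = []
    for i in range(window, len(readings)):
        baseline = [flow for _, flow in readings[i - window:i]]
        mu, sigma = mean(baseline), stdev(baseline)
        ts, flow = readings[i]
        # A sharp rise relative to the recent baseline suggests a leak,
        # since normal demand tends to vary slowly hour to hour.
        if sigma > 0 and (flow - mu) / sigma > z_threshold:
            alerts.append((ts, flow))
    return alerts

# Hypothetical overnight readings in gallons per minute: steady demand,
# then a jump that never subsides -- the profile of a likely main break.
readings = [
    ("00:00", 40.2), ("01:00", 39.8), ("02:00", 40.5), ("03:00", 40.1),
    ("04:00", 39.9), ("05:00", 40.3), ("06:00", 58.7), ("07:00", 59.1),
]

for ts, flow in detect_leaks(readings):
    print(f"possible leak at {ts}: {flow} gpm")
```

A real deployment would of course be far more sophisticated, correlating many meters, pressure sensors and acoustic data, but even this toy version shows why real-time data changes the economics: the anomaly surfaces within one reading interval instead of weeks later as an unexplained spike on a utility bill.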