Major droughts from California to South Africa have driven awareness of water scarcity. Many organizations now measure, manage and report their direct water consumption to address drought risks. However, there is not yet an accepted, easy-to-implement methodology for quantifying unseen, upstream water consumption. With demand for water rising and supplies dwindling because of climate change, drought and competition among industrial, agricultural and municipal users, it is important to understand the demands that energy-intensive activities place on water resources. The water-energy nexus is a useful lens for analyzing policy or resource management in a specific geography.
While there are many definitions of the water-energy nexus, one version highlights the relationship between upstream water consumption (e.g., water pumped into a well for oil extraction) and water used during electricity generation (e.g., water evaporated from cooling towers). Every form of electricity generation, even renewables like solar and wind, requires water. As a result, downstream water availability and quality are often affected by upstream energy production.1 These impacts stem not only from direct water consumption during resource extraction and electricity generation but also from indirect pathways, such as contamination of water following extraction (e.g., from coal-mining tailings).
To date, much attention has focused on the greenhouse gas (GHG) emissions from electricity production, while comparatively little has been paid to the amount of water required to generate it, even though electricity generation withdraws more water in the United States than any other use category. The direct water impacts of electricity production, combined with the indirect effects of climate change on water availability, make electricity a critical topic for water management.