The explosive growth of datacenters that followed ChatGPT's debut in 2022 has shone a spotlight on the environmental impact of these power-hungry facilities.
In the US, datacenters can consume anywhere between 300,000 and four million gallons of water a day to keep the compute housed within them cool, Austin Shelnutt of Texas-based Strategic Thermal Labs explained in a presentation at SC24 in Atlanta this fall.
We'll get to why some datacenters use more water than others in a bit, but in some regions consumption runs as high as 25 percent of a municipality's water supply.
This level of water consumption has, understandably, led to concerns over water scarcity and desertification, problems that were already worsening due to climate change and have only been exacerbated by the proliferation of generative AI. Today, the datacenters built to train these models often house tens of thousands of GPUs, each drawing up to 1,200 watts, virtually all of which ends up as heat.
However, over the next few years, hyperscalers, cloud providers, and model builders plan to deploy millions of GPUs and other AI accelerators drawing gigawatts of power, and that means even higher rates of water consumption.
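To put those figures in rough context, the back-of-the-envelope sketch below converts a GPU fleet's power draw into heat and then into daily evaporative water use. The fleet sizes, the assumption that every watt drawn is rejected as heat, and the figure of roughly 1.5 liters of water evaporated per kilowatt-hour of heat removed are illustrative assumptions, not numbers from Shelnutt's presentation.

```python
# Back-of-the-envelope: GPU fleet -> heat load -> daily evaporative water use.
# Assumptions (illustrative, not from the article): all electrical power ends
# up as heat, and evaporative cooling loses ~1.5 liters of water per kWh of
# heat rejected -- a commonly cited ballpark, and it varies widely by site.

GPU_POWER_W = 1_200          # per-GPU draw cited in the article
LITERS_PER_KWH = 1.5         # assumed evaporative loss per kWh of heat
GALLONS_PER_LITER = 0.264

def daily_water_gallons(gpu_count: int) -> float:
    """Rough daily evaporative water use for a given GPU fleet."""
    heat_kw = gpu_count * GPU_POWER_W / 1_000
    heat_kwh_per_day = heat_kw * 24
    return heat_kwh_per_day * LITERS_PER_KWH * GALLONS_PER_LITER

# Compare a single large training cluster with a hypothetical million-GPU fleet
for gpus in (50_000, 1_000_000):
    mw = gpus * GPU_POWER_W / 1e6
    print(f"{gpus:>9,} GPUs ~ {mw:,.0f} MW ~ {daily_water_gallons(gpus):,.0f} gal/day")
```

Under those assumptions, a 50,000-GPU cluster works out to roughly 60 MW of heat and on the order of half a million gallons of water a day, while a million GPUs lands in gigawatt territory and tens of millions of gallons, which is why fleet growth translates so directly into water demand.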