How much water does it take to train an AI?
Training a single large AI model can consume as much water as 2,500 people use in a year.
The servers in AI data centers generate intense heat and require constant cooling to function. Most facilities use evaporative cooling systems that consume millions of liters of fresh water. Training GPT-3, for example, required roughly 700,000 liters of water to keep servers from overheating.
Nerd Mode
Research led by Shaolei Ren at the University of California, Riverside, has documented the substantial water footprint of artificial intelligence. The team's 2023 study, "Making AI Less Thirsty," revealed that training GPT-3 in Microsoft's state-of-the-art U.S. data centers consumed approximately 700,000 liters of fresh water—enough to manufacture 370 BMW cars or 320 Tesla electric vehicles.

This cooling is essential because high-performance GPUs, such as the NVIDIA H100, generate significant heat during the intensive matrix calculations required for deep learning. Data centers typically manage this thermal load using cooling towers, where water evaporates to dissipate heat from the equipment. The water must be exceptionally pure to prevent mineral deposits and corrosion in the cooling infrastructure.

Beyond the initial training phase, inference—when the AI responds to user queries—is also water-intensive. Every 10 to 50 conversations with a chatbot like ChatGPT consumes approximately 500 milliliters of water, depending on server location and seasonal conditions. As companies like Google and Microsoft scale their AI operations, global water consumption for these technologies is projected to reach 6.6 billion cubic meters by 2027.
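The inference figures above can be turned into a rough back-of-envelope estimate. This sketch uses only the 500 mL per 10-50 conversations range reported by the study; the daily conversation count is a purely illustrative assumption, not a figure from the research.

```python
# Back-of-envelope water estimate from the reported range:
# every 10-50 chatbot conversations evaporate ~500 mL of fresh water.
ML_PER_BATCH = 500
CONVERSATIONS_LOW, CONVERSATIONS_HIGH = 10, 50

# Water footprint of a single conversation, in millilitres.
per_conv_high = ML_PER_BATCH / CONVERSATIONS_LOW   # worst case: 50 mL
per_conv_low = ML_PER_BATCH / CONVERSATIONS_HIGH   # best case: 10 mL

# Hypothetical load of 100 million conversations per day (assumption,
# chosen only to illustrate the scaling).
DAILY_CONVERSATIONS = 100_000_000

litres_low = per_conv_low * DAILY_CONVERSATIONS / 1000   # mL -> L
litres_high = per_conv_high * DAILY_CONVERSATIONS / 1000

print(f"{litres_low:,.0f} - {litres_high:,.0f} litres per day")
# 1,000,000 - 5,000,000 litres per day
```

Even at the optimistic end of the range, a service at that scale would evaporate about a million litres of fresh water daily on inference alone.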
Verified Fact
FP-0003003 · Feb 17, 2026