AI Model Training Water Consumption Estimate
GPT-3 training ≈ 5.4 million liters of water (on-site cooling plus the water consumed to generate off-site electricity), based on the assumptions in Li et al. (2023) for Microsoft's U.S. data centers.
Source: Li et al. 2023 — Making AI Less “Thirsty”
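As a rough cross-check, combining a commonly cited GPT-3 training energy estimate of roughly 1.29 GWh with the approximate WUE, PUE, and EWIF values used by Li et al. (2023) lands close to the headline figure (all inputs approximate):
1,287,000 kWh × (0.55 L/kWh on-site WUE + 1.17 PUE × 3.14 L/kWh off-site EWIF) ≈ 0.7 million L on-site + 4.7 million L off-site ≈ 5.4 million liters.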
AI Water Impact Calculator
Estimate water used for a single LLM query (cooling + electricity)
Enter your typical ChatGPT question
Assumptions (editable)
Energy per 1K tokens (kWh)
PUE
On-site WUE (L/kWh)
Off-site EWIF (L/kWh)
Reset assumptions
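Under the hood, these four assumptions combine using the standard WUE/EWIF accounting from Li et al. (2023): the query's IT energy times on-site WUE gives cooling water evaporated at the data center, and IT energy times PUE times off-site EWIF gives the water consumed to generate the electricity. The sketch below illustrates the arithmetic; the default values and the 4-characters-per-token heuristic are illustrative assumptions, not the calculator's exact internals.

```ts
// Per-query water estimate following the WUE/EWIF accounting in Li et al. (2023).
// All default values below are illustrative assumptions, not the page's exact defaults.
interface Assumptions {
  kwhPer1kTokens: number; // IT (server) energy per 1,000 tokens, kWh
  pue: number;            // power usage effectiveness: facility energy / IT energy
  wue: number;            // on-site water usage effectiveness, liters evaporated per IT kWh
  ewif: number;           // off-site energy water intensity factor, liters per kWh generated
}

const DEFAULTS: Assumptions = {
  kwhPer1kTokens: 0.004, // hypothetical placeholder
  pue: 1.17,             // roughly Microsoft's reported fleet-wide PUE
  wue: 0.55,             // roughly Microsoft's U.S. average WUE
  ewif: 3.14,            // roughly the U.S. average EWIF cited by Li et al.
};

// Crude token estimate: about 4 characters per token for English text.
const estimateTokens = (text: string): number =>
  Math.max(1, Math.round(text.length / 4));

function queryWaterLiters(text: string, a: Assumptions = DEFAULTS) {
  const tokens = estimateTokens(text);
  const itEnergyKwh = (tokens / 1000) * a.kwhPer1kTokens;
  const onSiteL = itEnergyKwh * a.wue;           // cooling water evaporated on site
  const offSiteL = itEnergyKwh * a.pue * a.ewif; // water behind the electricity supply
  return { tokens, itEnergyKwh, onSiteL, offSiteL, totalL: onSiteL + offSiteL };
}

// Example: a short prompt; multiply totalL by 1e6 for the "1 million users" figure below.
console.log(queryWaterLiters("Explain the water footprint of one ChatGPT query."));
```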
Results
Estimated tokens
62 tokens
IT energy
0 kWh (rounds to zero at the displayed precision)
On-site cooling
0.31 mL
Off-site electricity
1.153 mL
Total water for this query
1.463 mL
≈ 0.003 × 500 mL bottles
If 1 million users ran a similar query (for scale, ChatGPT was projected to reach about 700 million users in 2025)
Total water
1,463.2 L (1.463 m³)
≈ 2,926.4 × 500 mL bottles
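The fleet-scale number is just the per-query total multiplied linearly: 1.4632 mL × 1,000,000 ≈ 1,463.2 L ≈ 1.463 m³, i.e. about 2,926 half-liter bottles. At the roughly 700 million users mentioned above, the same query would scale to on the order of 1,000 m³.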
Use AI responsibly
Practical ways to reduce the water footprint of your AI usage:
• Choose smaller/faster models when they meet your needs.
• Batch requests and avoid repeated runs of the same query.
• Keep prompts concise and focused.
• Cache and reuse results where possible (see the caching sketch after this list).
• Turn off streaming and avoid requesting long outputs unless necessary.
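As a concrete illustration of the caching tip, here is a minimal sketch that answers repeated identical prompts from memory instead of re-running the model; askModel is a hypothetical stand-in for whatever LLM client you use.

```ts
// Minimal prompt cache: identical prompts are answered from memory instead of
// triggering another model call (and its associated energy and water use).
type AskFn = (prompt: string) => Promise<string>;

function withCache(askModel: AskFn): AskFn {
  const cache = new Map<string, Promise<string>>();
  return (prompt: string) => {
    const key = prompt.trim().toLowerCase(); // normalize trivially different prompts
    if (!cache.has(key)) {
      cache.set(key, askModel(prompt));      // only the first identical prompt hits the model
    }
    return cache.get(key)!;
  };
}

// Usage: const ask = withCache(realAskModel); then repeated ask("same question") calls reuse one result.
```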
Notes, references, and assumptions
This is a rough estimate combining on-site cooling water with the water consumed to generate off-site electricity. Real values vary by data center cooling type, weather, grid mix, and model size.
Li et al. (2023) Making AI Less “Thirsty” — methodology and example figures
Li et al. (2023), HTML version, with details on WUE/EWIF and per-request estimates
Meta Sustainability: data center water usage and WUE figures
Google Environmental Reports: water metrics and WUE context
IEA Water–Energy Nexus (context on electricity water intensity)