Summary

Exploration of the environmental costs of generative AI, focusing on electricity demand, water consumption for cooling, and the carbon footprint of hardware manufacturing.

Key quotes

“…a generative AI training cluster might consume seven or eight times more energy than a typical computing workload.”
“Researchers have estimated that a ChatGPT query consumes about five times more electricity than a simple web search.”
“Just because this is called ‘cloud computing’ doesn’t mean the hardware lives in the cloud.”

The article details how the scale of generative AI models increases power density in data centers, straining electrical grids and municipal water supplies. It also highlights a shift in where the energy costs fall: from the one-time expense of training to the cumulative impact of inference as these tools become ubiquitous.
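The multipliers in the quotes above can be made concrete with a back-of-envelope sketch. The 0.3 Wh baseline for a single web search below is a hypothetical assumption chosen for illustration, not a figure from the article; only the 5x and 7-8x ratios come from the quoted text.

```python
# Back-of-envelope energy comparison using the ratios quoted above.
# ASSUMPTION: 0.3 Wh per simple web search is a hypothetical baseline,
# not a number from the article.

SEARCH_WH = 0.3          # assumed energy per web search, watt-hours
CHATGPT_MULTIPLIER = 5   # "about five times more electricity" (quoted)
TRAINING_MULTIPLIER = 7  # "seven or eight times more energy" (quoted, lower bound)

chatgpt_wh = SEARCH_WH * CHATGPT_MULTIPLIER
training_relative = TRAINING_MULTIPLIER  # vs. a typical computing workload

print(f"Assumed web search:        {SEARCH_WH:.1f} Wh")
print(f"Estimated ChatGPT query:   {chatgpt_wh:.1f} Wh")

# Scaled to one million queries per day, the per-query difference compounds:
daily_kwh = chatgpt_wh * 1_000_000 / 1000  # Wh -> kWh
print(f"1M queries/day:            {daily_kwh:,.0f} kWh")
```

Even under this toy baseline, the point of the article holds: small per-query differences multiply into grid-scale demand once usage becomes ubiquitous.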