How Much Carbon Does AI Actually Use? (And Why It’s So Hard to Find Out)
Summary
The post explains AI’s growing carbon footprint, data‑center electricity use, why estimates vary, and how companies can measure and offset AI‑related emissions.
Key quotes
"Every AI query has a carbon footprint."
"Data centers consumed 183 TWh in 2024, accounting for more than 4% of national electricity use."
"Google's Gemini (the only model with official per-query disclosure) reports 0.03g CO2e per median text prompt."
"Inference (using AI models) now accounts for 60-90% of total AI energy consumption."
"AI queries typically use 3-10x more energy than traditional web searches."
The article outlines the scale of data-center electricity consumption, the shift of AI emissions from training to inference, the lack of transparency among AI providers, and practical steps companies can take to estimate, reduce, and offset AI-related carbon emissions.
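As a rough illustration of the kind of estimate companies can make, the per-prompt figure quoted above (0.03 g CO2e per median Gemini text prompt) can be scaled by query volume. This is a minimal sketch, not the article's method; the query volume and the assumption that one disclosed per-prompt figure generalizes to other workloads are illustrative.

```python
# Rough estimate of annual AI-related emissions from daily query volume.
# Uses the 0.03 g CO2e per median text prompt figure Google reports for
# Gemini; treating it as representative of other models is an assumption.

GRAMS_CO2E_PER_QUERY = 0.03  # Google's reported median for Gemini text prompts


def estimate_annual_emissions_kg(queries_per_day: float,
                                 grams_per_query: float = GRAMS_CO2E_PER_QUERY) -> float:
    """Annual CO2e in kilograms for a given daily query volume."""
    return queries_per_day * 365 * grams_per_query / 1000.0


# Example: a team running 50,000 AI queries per day.
print(round(estimate_annual_emissions_kg(50_000), 1))  # ~547.5 kg CO2e/year
```

The point of such a back-of-the-envelope number is less precision than scale: it shows why per-query disclosure (which only Gemini currently provides, per the article) matters for any company trying to measure and offset its AI footprint.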