We did the math on AI’s energy footprint. Here’s the story you haven’t heard.
Summary
An analysis of the energy and carbon footprint of AI, detailing the costs of inference versus training and projecting future demand as AI agents proliferate.
Key quotes
In 2017, AI began to change everything. Data centers started being built with energy-intensive hardware designed for AI, and by 2023 their electricity consumption had doubled.
It’s now estimated that 80–90% of computing power for AI is used for inference.
By 2028, more than half of the electricity going to data centers will be used for AI. At that point, AI alone could consume as much electricity annually as 22% of all US households.
The article examines the specific energy costs of text, image, and video generation, noting that while individual queries are small, the aggregate impact is massive. It highlights a lack of transparency from closed-source AI providers and the potential for increased electricity costs for residential ratepayers.
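The scaling effect the article describes, small per-query costs multiplying into a large aggregate, can be illustrated with a back-of-envelope calculation. The per-query energy and daily query volume below are illustrative assumptions, not figures from the article:

```python
# Back-of-envelope aggregate energy estimate.
# Both constants are hypothetical, chosen only to show how the arithmetic scales.
ENERGY_PER_QUERY_WH = 1.0        # assumed energy per text query, in watt-hours
QUERIES_PER_DAY = 1_000_000_000  # assumed daily query volume

daily_kwh = ENERGY_PER_QUERY_WH * QUERIES_PER_DAY / 1_000   # Wh -> kWh
annual_gwh = daily_kwh * 365 / 1_000_000                    # kWh -> GWh

print(f"Daily:  {daily_kwh:,.0f} kWh")   # -> Daily:  1,000,000 kWh
print(f"Annual: {annual_gwh:,.0f} GWh")  # -> Annual: 365 GWh
```

Even at one watt-hour per query, a negligible amount individually, a billion daily queries sum to hundreds of gigawatt-hours per year, which is why aggregate transparency from providers matters.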