Summary

Analysis of LLM energy use per query, citing Google's Gemini data and other estimates to argue that individual AI use is a small part of most people's carbon footprints.

Key quotes

"Google estimates that its median text query uses around 0.24 Wh of electricity."
"individual usage of ChatGPT and other LLMs for most people is a small part of their carbon and energy footprint."
"generating video seems to be energy-hungry... the advice that “LLMs are a small part of your environmental footprint” might not apply if you’re generating a lot of video."

The post compares energy estimates for LLM queries from Google, OpenAI, and Mistral AI. It notes a significant downward trend in energy per query over time due to efficiency gains.
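To put the 0.24 Wh per-query figure in context, here is a back-of-envelope sketch comparing daily query energy against typical household electricity use. The queries-per-day and household-consumption numbers are illustrative assumptions, not figures from the post.

```python
# Back-of-envelope comparison of LLM query energy vs. household electricity.
WH_PER_QUERY = 0.24         # Google's estimate for a median Gemini text query (from the post)
QUERIES_PER_DAY = 100       # assumed heavy-usage figure (illustrative, not from the post)
HOUSEHOLD_KWH_PER_DAY = 29  # rough average US household daily electricity use (assumption)

daily_query_kwh = WH_PER_QUERY * QUERIES_PER_DAY / 1000  # Wh -> kWh
share = daily_query_kwh / HOUSEHOLD_KWH_PER_DAY

print(f"Daily query energy: {daily_query_kwh:.3f} kWh")
print(f"Share of household electricity: {share:.2%}")
```

Under these assumptions, even 100 queries a day amounts to roughly 0.024 kWh, well under a tenth of a percent of a typical household's daily electricity, which illustrates the post's point that text queries are a small part of most individual footprints.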