What’s the Energy Consumption of AI? Those in the Know Aren’t Disclosing It

“People are often interested in the energy consumption of a ChatGPT query,” Sam Altman, OpenAI’s CEO, noted in a recent blog post. He indicated that the average query requires 0.34 watt-hours of energy: “Approximately what an oven uses in just over a second, or what a high-efficiency lightbulb consumes in a few minutes.”
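The comparison is easy to sanity-check. Here is a minimal back-of-envelope sketch, assuming typical appliance wattages (roughly a 1,000-watt oven and a 10-watt LED bulb; neither figure comes from OpenAI):

```python
# Back-of-envelope check of Altman's comparison, using assumed (not OpenAI-supplied)
# appliance wattages: a ~1,000 W oven and a ~10 W high-efficiency LED bulb.
QUERY_WH = 0.34        # Altman's stated energy per average ChatGPT query, in watt-hours
OVEN_W = 1_000         # assumed oven power draw, in watts
LED_BULB_W = 10        # assumed LED bulb power draw, in watts

oven_seconds = QUERY_WH / OVEN_W * 3600    # hours at that wattage, converted to seconds
bulb_minutes = QUERY_WH / LED_BULB_W * 60  # hours at that wattage, converted to minutes

print(f"Oven equivalent: {oven_seconds:.1f} seconds")      # ~1.2 s
print(f"LED bulb equivalent: {bulb_minutes:.1f} minutes")  # ~2.0 min
```

At those assumed wattages, 0.34 watt-hours works out to roughly 1.2 seconds of oven time or about two minutes of LED light, which is consistent with Altman’s framing.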

For a company boasting 800 million weekly active users (and counting), the total energy those queries consume matters more and more. Experts argue, however, that Altman’s figure means little without more public detail from OpenAI about how it was calculated: what counts as an “average” query, whether image generation is included, and whether additional energy costs, such as training AI models and cooling OpenAI’s servers, are accounted for.
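Scaling the per-query figure up to the whole service shows why those missing details matter, but doing so requires a query volume OpenAI has not disclosed. The sketch below uses a purely hypothetical 2.5 billion queries per day as a placeholder, not a company number:

```python
# Illustration only: OpenAI has not disclosed its query volume, so the daily
# query count below is a hypothetical placeholder, not a company figure.
QUERY_WH = 0.34            # Altman's per-query estimate, in watt-hours
QUERIES_PER_DAY = 2.5e9    # hypothetical assumption for illustration

daily_mwh = QUERY_WH * QUERIES_PER_DAY / 1e6   # watt-hours -> megawatt-hours per day
annual_gwh = daily_mwh * 365 / 1e3             # MWh per day -> gigawatt-hours per year

print(f"Daily energy:  {daily_mwh:,.0f} MWh")   # 850 MWh/day under these assumptions
print(f"Annual energy: {annual_gwh:,.0f} GWh")  # ~310 GWh/year under these assumptions
```

Under those assumptions the total lands in the hundreds of gigawatt-hours per year; change what counts as a query, or the assumed volume, and the answer shifts substantially, which is exactly why experts want the methodology disclosed.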

Consequently, Sasha Luccioni, climate lead at AI firm Hugging Face, remains skeptical about Altman’s estimate. “He could have pulled that out of his ass,” she comments. (OpenAI has yet to respond to inquiries regarding the calculation of this figure.)

As AI continues to permeate our lives, it also promises to revolutionize our energy systems, potentially increasing carbon emissions just as we strive to combat climate change. A new and expanding body of research is now focused on quantifying the actual carbon emissions linked to our AI usage.

This endeavor is complicated by the lack of environmental disclosures from major players like OpenAI. In an analysis currently under peer review, Luccioni and three collaborators make the case for greater environmental transparency in AI models. Using data from OpenRouter, a leaderboard of large language model (LLM) traffic, they found that 84 percent of LLM use in May 2025 involved models with zero environmental disclosure. In other words, consumers are overwhelmingly choosing models whose environmental effects are entirely unknown.

“It astounds me that you can purchase a car and know its miles per gallon, yet we use various AI tools daily without any efficiency metrics, emissions factors, or anything,” Luccioni states. “There are no mandates, no regulations. Given the current climate crisis, this should be a priority for regulators everywhere.”

Due to this transparency deficit, Luccioni asserts that the public is exposed to dubious estimates that are often accepted as fact. For example, you may have heard that an average ChatGPT request consumes ten times more energy than the average Google search. Luccioni and her colleagues trace this claim back to a public statement made by John Hennessy, chairman of Alphabet, Google’s parent company, in 2023.

A statement from the board chair of one company (Google) about the product of another company he has no affiliation with (OpenAI) is shaky at best. Yet Luccioni’s analysis shows that the figure has been repeated again and again in media and policy reports. (As I was drafting this piece, I received a pitch containing this exact statistic.)
