ZME Science

Everyone Thought ChatGPT Used 10 Times More Energy Than Google. Turns Out That’s Not True

Sam Altman revealed GPT-4o uses around 0.3 watt-hours of energy per query.

by Tibi Puiu
June 11, 2025
in Future, News
Edited and reviewed by Zoe Gordon
Credit: ZME Science/SORA.

For some time, a figure ricocheted through headlines, white papers, and social media feeds: ChatGPT, it was said, gulps ten times more electricity per question than a Google search. The claim was crisp, alarming, and easy to believe. After all, this was no ordinary piece of software. It could write sonnets, debug code, and explain quantum mechanics — all in a conversational tone.

Surely, it must be burning through megawatts to pull off such feats.

And in a world increasingly worried about carbon emissions and strained power grids, the idea that every typed question to an AI might be quietly gulping down watts caused an uproar.

But a closer look suggests that claim may already be as outdated as dial-up internet.

According to both independent research and none other than OpenAI CEO Sam Altman, the latest ChatGPT models use around 0.3 watt-hours per query, exactly the figure Google last reported for its average search back in 2009 (the last time it published any such numbers).

Revisiting an Outdated Claim

The 10-to-1 energy comparison between ChatGPT and Google searches likely originates from a 2023 estimate by data scientist Alex de Vries. That calculation pegged a ChatGPT query at using roughly 3 watt-hours of electricity.

Meanwhile, the energy cost of a Google search was typically cited as 0.3 watt-hours (roughly 1 kJ) — a figure published by Google back in 2009 and shared by Urs Hölzle, one of Google’s senior VPs at the time. It was a neat comparison, and it stuck.


Yet nearly everything about that framing is worth questioning. Google’s 0.3 Wh estimate came from an era before smartphones were ubiquitous; the internet itself was a different beast. Google’s data centers have become much more efficient since 2009. At the same time, Google now runs AI on almost all of its queries, as you’ve probably noticed in that “AI Overview” box that appears above the fold, pushing down the organic “blue link” results.

As for ChatGPT, its models, hardware, and deployment systems have all evolved rapidly over the last year.

Time for a New Benchmark

Estimates of the power cost of ChatGPT queries alongside other power-consuming tasks.

Recent work by the research team at Epoch.ai — based on technical modeling, data center hardware assumptions, and realistic user behavior — estimates that the average ChatGPT query using OpenAI’s GPT-4o model requires only 0.3 watt-hours of energy.

“This is around 10 times lower than the widely-cited 3 watt-hour estimate!” the authors write.

That number should feel familiar. It’s the same as Google’s 2009 estimate. For reference, the average US household uses 10,500 kilowatt-hours of electricity per year, or over 28,000 watt-hours per day.

Sam Altman, OpenAI’s CEO, echoed the same value in his recent essay, The Gentle Singularity. “People are often curious about how much energy a ChatGPT query uses; the average query uses about 0.34 watt-hours,” he wrote.

He compared it to what an oven uses in a second or an LED lightbulb in a few minutes.
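Those comparisons check out with simple arithmetic. The sketch below assumes a typical ~1,200 W oven element and a ~10 W LED bulb; the wattages are illustrative assumptions, not figures from Altman’s essay.

```python
# Sanity-check the oven/LED comparison for a 0.34 Wh query.
# The 1200 W oven and 10 W LED are assumed typical wattages.
QUERY_WH = 0.34

oven_seconds = QUERY_WH / (1200 / 3600)  # a 1200 W oven uses 1/3 Wh per second
led_minutes = QUERY_WH / (10 / 60)       # a 10 W LED uses 1/6 Wh per minute

print(f"oven: {oven_seconds:.1f} s, LED: {led_minutes:.1f} min")
# → oven: 1.0 s, LED: 2.0 min
```

So one query really does cost about one second of oven time, or a couple of minutes of LED light.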

Altman’s figure is remarkably similar to Epoch.ai’s estimate, which examined the number of floating-point operations (FLOPs) required for a typical query, assumed a realistic number of tokens per output, and accounted for the efficiency of modern GPUs like Nvidia’s H100, the same chips widely used in AI data centers.

They also applied pessimistic assumptions: an overestimated output length (500 tokens), worst-case power draw (1,500 watts per GPU), and low utilization efficiency (just 10%). Even then, the final number came out to a conservative 0.3 watt-hours, though it can swell into double digits depending on how heavy a query’s compute requirements are. Attaching a short academic paper or a long magazine article to GPT-4o uses around 2.5 watt-hours, while a very long text of 100,000 tokens (roughly 200 pages) would require almost 40 watt-hours.
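For the curious, the shape of that back-of-envelope calculation can be sketched in a few lines of Python. All inputs below (active parameter count, token count, GPU throughput) are illustrative assumptions in the spirit of Epoch.ai’s pessimistic scenario, not published OpenAI numbers.

```python
def query_energy_wh(active_params, output_tokens, peak_flops_per_s,
                    utilization, gpu_power_w):
    """Rough watt-hours for one chat completion.

    A decoder-only transformer needs roughly 2 * active_params FLOPs
    per generated token for the forward pass.
    """
    total_flops = 2 * active_params * output_tokens
    seconds = total_flops / (peak_flops_per_s * utilization)
    joules = seconds * gpu_power_w
    return joules / 3600  # 1 Wh = 3600 J

# Assumed inputs: ~100B active parameters, 500 output tokens,
# ~1e15 FLOP/s peak for an H100-class GPU, 10% utilization, 1500 W draw.
print(round(query_energy_wh(100e9, 500, 1e15, 0.10, 1500), 2))  # → 0.42
```

Under these deliberately pessimistic inputs the estimate lands in the same few-tenths-of-a-watt-hour ballpark as the published figure.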

Comparing Apples to Future Apples

So, what does this mean?

For one, the often-repeated claim that ChatGPT is ten times more energy-intensive than a Google search is no longer supported by current data. It may have once been a fair comparison, based on older hardware and larger model assumptions. But in 2025, it looks increasingly outdated.

And even the 0.3 Wh figure might be on the high end.

GPT-4o is not the only model used in ChatGPT. OpenAI’s GPT-4o-mini, available to free-tier users, is likely even more efficient. It has fewer parameters, a lower cost per token, and faster response times. That means its energy cost per query is probably lower than 0.3 Wh.

More specialized models, such as o1 or o3, could be more energy-intensive. But they’re currently used in niche applications like coding or research workflows. For everyday chatbot queries — answering emails, summarizing text, answering simple questions, casual conversation — the bulk of usage still falls on GPT-4o and its smaller variants.

And what about Google?

It’s hard to say. Google has not released updated energy use data for searches in over 15 years. In that time, search has become more complex, integrating AI overviews, language models, and personalized recommendations. If anything, the true energy cost of a Google search may have increased, but we can’t really tell for sure since Google isn’t transparent in this regard.

Why This Matters

Misunderstandings about AI’s energy footprint have real-world consequences. Policy discussions, public perception, and even funding for green AI initiatives depend on how we frame the technology.

The image of AI as an energy glutton — a carbon-spewing server farm running 24/7 to draft your emails — makes for good headlines. But one must be careful not to fall for exaggerations or outdated claims.

Of course, there are reasons to keep a close eye on AI’s environmental impact. Training large models can require enormous amounts of power. And if AI assistants start running on always-on devices, the total footprint could swell.

A recent comprehensive investigation by MIT Technology Review reveals that the energy demands of AI are reshaping the entire digital infrastructure.

From 2005 to 2017, the electricity usage of data centers remained relatively flat, even as online services exploded. But since AI came on the scene, data center energy use has doubled. Today, around 4.4% of all electricity in the United States goes to data centers, a number expected to triple by 2028.

Why? Because AI is no longer confined to research labs or niche apps. It’s now embedded in search, voice assistants, customer service bots, and even fitness apps. Every AI-powered image, video, or recommendation requires compute — and compute requires power.

Massive investments are already underway. OpenAI and Microsoft are backing the $500 billion Stargate initiative to build AI-centric data centers. Google plans to spend $75 billion on AI infrastructure in 2025 alone. These data centers will rival the energy needs of small countries. Some could require 5 gigawatts of power — more than the entire state of New Hampshire.

The MIT analysis also reminded us of a disturbing blind spot: tech companies rarely disclose how much energy their AI models actually use. Closed-source systems like ChatGPT, Gemini, and Claude are black boxes. When Altman says his technology uses 0.3 Wh per query, you have to take his word for it. Without transparency, it’s nearly impossible for regulators, researchers, or the public to plan for the future or hold companies accountable.

AI models are becoming more personalized, more agentic, and more embedded in our lives. Inference — the energy cost of using AI — is already outpacing training, accounting for 80–90% of AI’s total compute.

So, while the per-query impact may feel minor now, it’s just one frame in a much larger, unfolding picture — one in which AI doesn’t just answer our questions, but helps redraw the lines of the power grid itself.

Tags: AI, ChatGPT, energy, Google, OpenAI

Tibi Puiu

Tibi is a science journalist and co-founder of ZME Science. He writes mainly about emerging tech, physics, climate, and space. In his spare time, Tibi likes to make weird music on his computer and groom felines. He has a B.Sc in mechanical engineering and an M.Sc in renewable energy systems.


© 2007-2025 ZME Science - Not exactly rocket science. All Rights Reserved.
