Ben Daubney

Does a generative AI search really burn down the world?


This post is part of a series for #WeblogPoMo2025. Read the introduction here.


A really short answer on this one: it depends on the context.

A common objection to using AI is that a single query requires far more energy than performing a similar websearch.

There are no definitive sources on how much power either of those situations involves, so it's impossible to say for sure.

The most commonly cited stat is that a generative AI query consumes 2.9 watt-hours (Wh) of energy compared to 0.3 Wh for a Google search, so, broadly speaking, AI needs about ten times the electricity.

The source is a 2024 report by the International Energy Agency, a group which "works with governments and industry to shape a secure and sustainable energy future for all". The figure is hidden away on page 34, included without citation or source.

The IEA isn't an unbiased source. The figures given are arbitrary and out of date. They refer to 'typical' searches without defining what that means. Neither figure accounts for the energy required to produce Google's indexes and algorithms or ChatGPT's corpus of weights.

But this is the common stat: 2.9 Wh per AI query, 0.3 Wh for a websearch.


This lacks context.

Let's try to understand if 2.9 Wh is a lot of energy or not.

(Check the footnotes from hereon out. There are a lot of caveats!)

Running an energy-saving lightbulb for an hour uses 10 Wh [1]; if the 2.9 Wh figure is true, that means about three AI queries equal an hour of light.

Using a laptop for an hour requires 5 Wh [2]. That's about two AI queries.

If you've got a Raspberry Pi Zero connected somewhere to your home network, that's using 16.8 Wh of energy every day [3]. That's just over five-and-a-half AI queries.

Making a slice of toast requires 20 Wh [4]. Boiling a kettle to make a cup of tea - the standard British unit of energy consumption - uses 113 Wh [5]. Making breakfast is therefore the equivalent of 46 AI queries.
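All of these comparisons are just a division by 2.9 Wh. A quick back-of-the-envelope sketch in Python, using the rounded figures quoted above (not measurements of my own):

```python
# Energy per activity in watt-hours (Wh), from the rounded figures above
AI_QUERY_WH = 2.9  # the commonly cited IEA per-query figure

activities_wh = {
    "hour of energy-saving lightbulb": 10,
    "hour of laptop use": 5,
    "day of Raspberry Pi Zero": 16.8,
    "slice of toast": 20,
    "cup of tea (boiled kettle)": 113,
}

for name, wh in activities_wh.items():
    print(f"{name}: {wh} Wh = about {wh / AI_QUERY_WH:.1f} AI queries")

# Breakfast is toast plus tea
breakfast_wh = activities_wh["slice of toast"] + activities_wh["cup of tea (boiled kettle)"]
print(f"breakfast: {breakfast_wh} Wh = about {breakfast_wh / AI_QUERY_WH:.0f} AI queries")
```

Change `AI_QUERY_WH` and the whole table shifts, which is rather the point: everything downstream hangs off that one uncited number.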

Putting it like that, asking the odd question in Perplexity here and there doesn't seem so bad.


But this lacks context!

Five years ago, the average internet user wasn't running any AI queries. The same IEA report that generated the 2.9 Wh figure gives a pretty stark view of future consumption:

AI electricity demand can be forecast more comprehensively based on the amount of AI servers that are estimated to be sold in the future and their rated power. The AI server market is currently dominated by tech firm NVIDIA, with an estimated 95% market share. In 2023, NVIDIA shipped 100,000 units that consume an average of 7.3 TWh of electricity annually. By 2026, the AI industry is expected to have grown exponentially to consume at least ten times its demand in 2023.

(Emphasis mine)

That 7.3 TWh figure is just the likely consumption of the AI servers sold by one particular vendor in one year. Those servers will last multiple years. There are multiple vendors. The ten-times multiplier is applied against an unknown current total, but we can certainly say we're talking well beyond 73 TWh annually.
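The arithmetic behind that floor is simple enough to sketch. The 7.3 TWh and ten-times figures are the IEA's; the 2.9 Wh per-query figure is the one quoted earlier:

```python
# Floor estimate for annual AI electricity demand, in TWh
NVIDIA_2023_TWH = 7.3    # consumption of NVIDIA's 2023 shipments alone
GROWTH_MULTIPLIER = 10   # the IEA's "at least ten times" 2023 demand by 2026

floor_2026_twh = NVIDIA_2023_TWH * GROWTH_MULTIPLIER
print(f"2026 floor: {floor_2026_twh:.0f} TWh annually")

# For scale: how many 2.9 Wh queries would 7.3 TWh alone cover?
WH_PER_TWH = 1e12
queries_per_year = NVIDIA_2023_TWH * WH_PER_TWH / 2.9
print(f"roughly {queries_per_year:.1e} queries per year")
```

That works out to around 2.5 trillion queries a year from those 2023 servers alone, which gives some sense of the scale involved.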


But this lacks context!

This energy increase will not all be net-new.

Google searches are going down. The increase in power consumption due to AI will be tempered - slightly - by that decrease.

Given that web search results have become pretty awful, it may be more efficient to run one AI query than a dozen increasingly frustrated web searches.

And AI models are getting more efficient. The 2.9 Wh figure is a year old and predates the release of DeepSeek R1, whose dramatic efficiency has spurred other companies to try to match or beat it.

Yet, however you look at it, more power will be consumed as more people use AI services.


In researching this blog, I read this article by Andy Masley. He's pro-AI and his about page talks about 'effective altruism' which raises all sorts of alarm bells.

I don't think he's necessarily right but his articles are well cited, well written, and invite reflection.

His article opens with five charts. One shows that the reduction in CO2 emissions from asking ChatGPT 50,000 fewer questions per year is around 0.1 tonnes, compared to an annual saving of 2 tonnes from switching to an electric car. Another shows that the water used in a data centre per AI prompt is comparable to that of downloading an app. One shows that the water involved in making one hamburger is the equivalent of 198,000 ChatGPT requests. Another says that Fortnite uses more energy every day than the whole of ChatGPT as a company.

Well, hey! Put like that, the water and energy use from ChatGPT-ing all over the place is pretty insignificant!

These stats are all cherry-picked. I invite you to review them with the same scepticism I have.

I point them out here not to position AI's energy use as OK or trivial, but to appreciate how people who are for or against AI justify their positions. Two queries equalling the power consumption of an hour on a laptop sounds terrible; Fortnite using more power makes it seem trivial.


Masley insists that "using ChatGPT is not bad for the environment".

I don't agree. Everything with computers and electricity use is bad for the environment.

But why am I asked to worry about this for AI when it's not an issue for other things? For using an electric toothbrush rather than a manual one, or for playing records on a stereo, or streaming music from Spotify, or cooking dinner on an induction hob?

AI is new, controversial, and political. Some will point to energy usage as the main reason why they dislike generative AI; to many others it's a helpful additional reason to dislike AI.

We lack context.


  1. This article reports that "using an LED light bulb for 1 hour per day results in 0.07 kilowatt-hours (kWh) of electricity per week". That's 0.07 kWh for seven hours, or 0.01 kWh per hour. 0.01 kWh is the same as 10 Wh. Maths.

  2. Lots of variability here depending on the laptop and the work it is doing. This blog has a good analysis which estimates 50.34 Wh used per day in the "nine or ten hours" a laptop is "in performance mode", i.e. active use. Dividing that 50 Wh by ten hours gives us 5 Wh. More maths.

  3. This blog reports that a Pi Zero with wi-fi draws 120 mA, or 0.7 W, of power. Over twenty-four hours, that's 16.8 Wh.

  4. A lot, huh? Toasters are high wattage - mine requires 1,200 watts. Assuming toast takes exactly 60 seconds to cook (it doesn't), that's an energy usage of 72,000 joules or 20 watt-hours.

  5. This article notes that half-filling a kettle and bringing the water to a boil uses 0.113 kWh. That's 113 Wh.
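The footnote conversions above are plain unit arithmetic (1 kWh = 1,000 Wh; 1 Wh = 3,600 joules). A sketch for anyone who wants to check the maths:

```python
# [1] 0.07 kWh over seven hours of LED use -> Wh per hour
led_wh_per_hour = 0.07 * 1000 / 7       # 10 Wh

# [2] 50.34 Wh over ten hours of active laptop use -> Wh per hour
laptop_wh_per_hour = 50.34 / 10         # roughly 5 Wh

# [3] 0.7 W drawn continuously for twenty-four hours -> Wh per day
pi_wh_per_day = 0.7 * 24                # 16.8 Wh

# [4] A 1,200 W toaster running for 60 seconds -> joules, then Wh
toast_joules = 1200 * 60                # 72,000 J
toast_wh = toast_joules / 3600          # 20 Wh

# [5] 0.113 kWh for the kettle -> Wh
kettle_wh = 0.113 * 1000                # 113 Wh

print(led_wh_per_hour, laptop_wh_per_hour, pi_wh_per_day, toast_wh, kettle_wh)
```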

#AI #WeblogPoMo2025 #main #technology