In a first, Google has released data on how much energy an AI prompt uses
Submitted 18 hours ago by kinther@lemmy.world to technology@lemmy.world
https://www.technologyreview.com/2025/08/21/1122288/google-gemini-ai-energy/
In total, the median prompt—one that falls in the middle of the range of energy demand—consumes 0.24 watt-hours of electricity, the equivalent of running a standard microwave for about one second. The company also provided average estimates for the water consumption and carbon emissions associated with a text prompt to Gemini.
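The headline claim of 0.24 Wh per prompt equalling about one second of microwave use can be sanity-checked. The 1,000 W microwave rating below is an assumption (typical consumer models run roughly 800–1,200 W), not a figure from the article:

```python
# Back-of-envelope check: 0.24 Wh per median prompt versus one second of
# microwave use. The 1,000 W power draw is an assumed typical rating.
PROMPT_WH = 0.24       # Google's reported median energy per text prompt
MICROWAVE_W = 1000     # assumed microwave power draw in watts

prompt_joules = PROMPT_WH * 3600             # 1 Wh = 3,600 J
microwave_seconds = prompt_joules / MICROWAVE_W

print(f"{prompt_joules:.0f} J ≈ {microwave_seconds:.2f} s of microwave time")
# → 864 J ≈ 0.86 s, i.e. roughly the "about one second" in the article
```

At an assumed 1,000 W the figure works out to just under a second, consistent with the article's rounding.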
Comments
sbv@sh.itjust.works 17 hours ago
unmagical@lemmy.ml 17 hours ago
There are zero downsides when mentally associating an energy hog with “1 second of use time of the device that is routinely used for minutes at a time.”
ArbitraryValue@sh.itjust.works 16 hours ago
With regard to sugar: when I started counting calories I discovered that the actual amounts of calories in certain foods were not what I intuitively assumed. Some foods turned out to be much less unhealthy than I thought. For example, it turns out that I can eat almost three pints of ice cream a day and not gain weight (as long as I don’t eat anything else). So sometimes instead of eating a normal dinner, I want to eat a whole pint of ice cream and I can do so guilt-free.
Likewise, I use both AI and a microwave. My daily energy use from AI is apparently less than the energy I use to reheat a cup of tea, so the conclusion that I can use AI as much as I want without significantly affecting my environmental impact is the correct one.
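The tea comparison above can be made concrete with a rough sketch. All inputs here are assumptions (a 250 ml cup heated by 60 °C, microwave losses ignored) except the 0.24 Wh prompt figure from the article:

```python
# Rough check of the tea-vs-AI comparison. Assumed: a 250 ml cup of tea
# reheated by 60 °C, with microwave inefficiency ignored (so the real
# prompt count per cup would be even higher).
CUP_ML = 250
TEMP_RISE_C = 60
SPECIFIC_HEAT = 4.186   # J per gram per °C for water
PROMPT_WH = 0.24        # Google's reported median figure

tea_joules = CUP_ML * SPECIFIC_HEAT * TEMP_RISE_C
tea_wh = tea_joules / 3600
prompts_per_cup = tea_wh / PROMPT_WH

print(f"Reheating one cup ≈ {tea_wh:.1f} Wh ≈ {prompts_per_cup:.0f} prompts")
# → about 17 Wh, or roughly 70 median prompts per reheated cup
```

Under these assumptions a single reheated cup of tea covers on the order of 70 median prompts, which supports the commenter's point.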
Maaji@lemmynsfw.com 16 hours ago
This doesn’t really track with companies commissioning power plants to support the power demands of AI training.
null@lemmy.nullspace.lol 12 hours ago
It does if you consider that they are actually building them to support power usage of datacenters. And that datacenters are used for a lot more than just AI training.
DarkCloud@lemmy.world 17 hours ago
The article also mentions that each query evaporates 0.26 milliliters of water… or “about five drops”.
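The “about five drops” phrasing checks out under the common approximation of 20 drops per milliliter (an assumption, since drop size varies):

```python
# Quick check of the "about five drops" description. A drop is commonly
# approximated as 0.05 ml (the 20-drops-per-ml rule of thumb).
WATER_ML_PER_PROMPT = 0.26   # figure from the article
DROP_ML = 0.05               # assumed drop volume

drops = WATER_ML_PER_PROMPT / DROP_ML
print(f"{drops:.1f} drops")  # → 5.2, matching the article's wording
```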
null@lemmy.nullspace.lol 13 hours ago
I wonder how many people clutching their pearls over this also eat meat…
ganksy@lemmy.world 14 hours ago
In addition:
This report was also strictly limited to text prompts, so it doesn’t represent what’s needed to generate an image or a video.
rowrowrowyourboat@sh.itjust.works 16 hours ago
This feels like PR bullshit to make people feel like AI isn’t all that bad. Assuming what they’re releasing is even true. Not like cigarette, oil, or sugar companies ever lied or anything and put out false studies and misleading data.
However, there are still details that the company isn’t sharing in this report. One major question mark is the total number of queries that Gemini gets each day, which would allow estimates of the AI tool’s total energy demand.
Why wouldn’t they release this? Even if each query uses minimal energy, countless queries a day would add up to a huge amount of energy.
Which is probably what’s happening, and why they’re not releasing that number.
the_q@lemmy.zip 16 hours ago
That’s because it is. This is to help fence-sitters feel better about using a product that factually consumes insane amounts of resources.
NotMyOldRedditName@lemmy.world 14 hours ago
There were people estimating 40 W in earlier threads on Lemmy, which was ridiculous.
This seems more realistic.
Ilovethebomb@sh.itjust.works 10 hours ago
I think that figure came from the article, and was based on some very flawed methodology.
frezik@lemmy.blahaj.zone 13 hours ago
The company has signed agreements to buy over 22 gigawatts of power from sources including solar, wind, geothermal, and advanced nuclear projects since 2010.
None of those advanced nuclear projects are yet actually delivering power, AFAIK. They’re mostly in planning stages.
The above isn’t all to run AI, of course. Nobody was thinking about datacenters just for AI training in 2010. But to be clear, there are 94 nuclear power reactors in the US, and a rule of thumb is that each produces about 1 GW. So Google is taking up the equivalent of roughly one quarter of the entire US nuclear power industry, but doing it with solar/wind/geothermal that could be used to drop our fossil fuel dependence elsewhere.
How much of that is used to run AI isn’t clear here, but we know it has to be a lot.
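The “roughly one quarter” estimate above follows directly from the comment’s own inputs (the 1 GW-per-reactor rule of thumb and the 94-reactor count are taken from the comment, not independently verified here):

```python
# The "one quarter" claim made explicit. Inputs are the comment's own
# figures: 22 GW contracted by Google, 94 US reactors at ~1 GW each.
GOOGLE_GW = 22
US_REACTORS = 94
GW_PER_REACTOR = 1   # rule of thumb from the comment

share = GOOGLE_GW / (US_REACTORS * GW_PER_REACTOR)
print(f"{share:.0%} of US nuclear capacity")  # → 23%, about a quarter
```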
L0rdMathias@sh.itjust.works 16 hours ago
median prompt size
Someone didn’t pass statistics, but did pass their marketing data presentation classes.
Wake me up when they release useful data.
jim3692@discuss.online 13 hours ago
It is indeed very suspicious that they talk about “median” and not “average”.
For those who don’t understand what the difference is, think of the following numbers:
1, 2, 3, 34, 40
The median is 3, because it’s in the middle.
The average is 16 (1+2+3+34+40=80, 80/5=16).
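The same example, computed with Python’s standard library. With a skewed distribution like this, the mean is pulled far above the median, which is exactly why the choice of statistic matters for per-prompt energy claims:

```python
# Median vs. mean on the comment's example numbers.
from statistics import mean, median

energies = [1, 2, 3, 34, 40]
print(median(energies))  # → 3   (middle value)
print(mean(energies))    # → 16  (sum 80 / count 5)
```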
HubertManne@piefed.social 13 hours ago
The big thing to me is that I want them to compare the same thing with web searches. If they want to use the median, then fine, but compare the median AI query to the median Google search.
StrangeMed@lemmy.world 17 hours ago
Nice share! Mistral also shared data about one of its largest models (not the one that answers in Le Chat, since that one is Medium, a smaller model, which I guess has smaller energy requirements).
salty_chief@lemmy.world 17 hours ago
So, as thought, virtually no impact. AI is here and not leaving. It will probably outlast humans on earth.
tekato@lemmy.world 17 hours ago
Let’s see OpenAI’s numbers
FarraigePlaisteach@lemmy.world 15 hours ago
Microwaves are very energy heavy. This isn’t very reassuring at all.
Armok_the_bunny@lemmy.world 17 hours ago
Cool, now how much power was consumed before even a single prompt was run in training that model, and how much power is consumed on an ongoing basis adding new data to those AI models even without user prompts? Also, how much power was consumed with each query before AI was shoved down our throats, and how many prompts does an average user make per day?
Grimy@lemmy.world 16 hours ago
I did some quick math with Meta’s Llama model, and the training cost was about a flight to Europe’s worth of energy. Not a lot when you consider the number of people who use it compared to the flight.
Whatever you’re imagining as the impact, it’s probably a lot less. AI is much closer to video games than to things that are actually a problem for the environment, like cars, planes, deep sea fishing, mining, etc. The impact would be virtually zero if we had a proper grid based on renewables.
Damage@feddit.it 16 hours ago
If their energy consumption actually was so small, why are they seeking to use nuclear reactors to power data centres now?
fmstrat@lemmy.nowsci.com 7 hours ago
I’d like to understand what this math was before accepting this as fact.
douglasg14b@lemmy.world 5 hours ago
A flight to Europe’s worth of energy is a pretty asinine way to measure this. Is it not?
It’s also not that small a number, being roughly 600 megawatt-hours of energy.
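The ~600 figure above can be roughly reproduced from flight fuel burn. Both inputs below are assumptions (a widebody transatlantic flight burning on the order of 45 tonnes of jet fuel, at roughly 12 kWh of chemical energy per kg), not figures from the thread:

```python
# Hedged reconstruction of the "flight to Europe" energy estimate.
# Assumed: ~45 t of jet fuel for one transatlantic widebody flight,
# and ~12 kWh of energy per kg of jet fuel.
FUEL_TONNES = 45    # assumed one-way fuel burn
KWH_PER_KG = 12     # approximate energy density of jet fuel

flight_mwh = FUEL_TONNES * 1000 * KWH_PER_KG / 1000
print(f"≈ {flight_mwh:.0f} MWh")  # → ≈ 540 MWh, in the ballpark of ~600
```

Under these assumptions a single transatlantic flight is on the order of 500–600 MWh, which is consistent with both comments.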
taiyang@lemmy.world 16 hours ago
I usually liken it to video games, ya. Is it worse than nothing? Sure, but that flight or road trip, etc., is a bigger concern. Not to mention that even before AI we had industrial energy and water usage that isn’t sustainable… almonds in CA alone are a bigger problem than AI, for instance.
Not that I’m pro-AI cause it’s a huge headache from so many other perspectives, but the environmental argument isn’t enough. Corpo greed is probably the biggest argument against it, imo.