I’d like to understand what this math was before accepting this as fact.
Comment on In a first, Google has released data on how much energy an AI prompt uses
Grimy@lemmy.world 1 day ago
I did some quick math with Meta's Llama model, and the training cost was about a flight to Europe's worth of energy. That's not a lot when you consider how many people use it compared to the flight.
Whatever you're imagining as the impact, it's probably a lot less. AI is much closer to video games than to things that are actually a problem for the environment, like cars, planes, deep sea fishing, mining, etc. The impact would be virtually zero if we had a proper grid based on renewables.
fmstrat@lemmy.nowsci.com 15 hours ago
douglasg14b@lemmy.world 13 hours ago
A flight to Europe’s worth of energy is a pretty asinine way to measure this. Is it not?
It's also not that small a number, being ~600 Megawatts of energy.
xthexder@l.sw0.com 12 hours ago
A Megawatt is a unit of power, not energy. It means nothing without a duration attached, like Megawatt-hours.
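A minimal sketch of that distinction, using the ~600 MW figure from above with a 1-hour duration assumed purely for illustration:

```python
# Power (MW) times duration (hours) gives energy (MWh).
# The duration here is an assumption, just to show the conversion.
power_mw = 600
duration_h = 1
energy_mwh = power_mw * duration_h
print(energy_mwh, "MWh")  # 600 MWh
```

The same 600 MW draw over half an hour would be 300 MWh; power alone pins down nothing.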
taiyang@lemmy.world 1 day ago
I usually liken it to video games, ya. Is it worse than nothing? Sure, but that flight or road trip, etc., is a bigger concern. Not to mention that even before AI we had industrial energy and water usage that isn't sustainable… almonds in CA alone are a bigger problem than AI, for instance.
Not that I’m pro-AI cause it’s a huge headache from so many other perspectives, but the environmental argument isn’t enough. Corpo greed is probably the biggest argument against it, imo.
Damage@feddit.it 1 day ago
If their energy consumption actually was so small, why are they seeking to use nuclear reactors to power data centres now?
null@lemmy.nullspace.lol 22 hours ago
Because demand for data centers is rising, with AI as just one of many reasons.
But that’s not as flashy as telling people it takes the energy of a small country to make a picture of a cat.
Dojan@pawb.social 7 hours ago
Sure we do. Do we want the big tech corporations to hold the reins of that though?
anomnom@sh.itjust.works 12 hours ago
Didn’t xitter just install a gas powered data center that’s breaking EPA rules for emissions?
TomArrr@lemmy.world 6 hours ago
Yes, yes it did. And as far as I can tell, it’s still belching it out, just so magats can keep getting owned by it. What a world
tennesseelookout.com/…/a-billionaire-an-ai-superc…
Imacat@lemmy.dbzer0.com 22 hours ago
To be fair, nuclear power is cool as fuck and would reduce the carbon footprint of all sorts of bullshit.
douglasg14b@lemmy.world 13 hours ago
That’s not small…
Hundreds of gigawatts is how much energy that is. Fuel is pretty damn energy dense.
A Boeing 777 might burn 45,000 kg of fuel, at a density of 47 MJ/kg. Which comes out to…
600 Gigawatts
Or about 60 houses' worth of energy usage for a year in the U.S.
It's an asinine way to measure it, to be fair: not only is it incredibly ambiguous, but almost no one has any reference for how much energy that actually is.
xthexder@l.sw0.com 11 hours ago
That’s not ~600 Megawatts, it’s 587 Megawatt-hours.
Or in other terms that are maybe easier to understand: 5875 fully charged 100kWh Tesla batteries.
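The arithmetic behind that correction, using the fuel figures from the comment above (45,000 kg at 47 MJ/kg) and a nominal 100 kWh battery pack:

```python
fuel_kg = 45_000            # jet fuel burned on a long-haul 777 flight (figure from the thread)
density_mj_per_kg = 47      # energy density of jet fuel
total_mj = fuel_kg * density_mj_per_kg   # 2,115,000 MJ of chemical energy
total_kwh = total_mj * 1000 // 3600      # 1 kWh = 3.6 MJ -> 587,500 kWh
packs = total_kwh // 100                 # nominal 100 kWh packs
print(total_kwh / 1000, "MWh =", packs, "battery packs")  # 587.5 MWh = 5875 battery packs
```

That 587.5 MWh is an amount of energy; "600 Megawatts" with no duration attached is a rate, which is where the earlier figure went wrong.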
Armok_the_bunny@lemmy.world 1 day ago
Volume of requests, plus power consumption unrelated to the requests themselves, at least I have to assume. It certainly doesn't help that Google has forced me to make a request to their AI every time I run a standard search.
Rentlar@lemmy.ca 23 hours ago
Seriously. I'd be somewhat less concerned about the impact if it were only used voluntarily. Instead, AI is compulsively shoved into every nook and cranny of every digital product simply to justify its own existence.
The power requirement for training is ongoing, too: mere days after Sam Altman released a very underwhelming GPT-5, he began hyping up the next one.
zlatko@programming.dev 13 hours ago
I've also never seen a calculation that took into account my VPS costs. The fckers scrape half the internet, warming up every server in the world connected to it. How much energy is that?
finitebanjo@piefed.world 22 hours ago
Because training has diminishing returns: the small improvement between (for example purposes) GPT-3 and GPT-4 will need exponentially more power to have the same effect on GPT-5. In 2022 and 2023, OpenAI and DeepMind both predicted that reaching human accuracy could never be done, the latter concluding it was impossible even with infinite power.
So in order to get as close as possible, in the future they will need as much power as possible. Academic papers outline it as the one true bottleneck.
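A purely illustrative sketch of what diminishing returns of that shape look like (the 10x-per-increment ratio is invented for the example, not taken from any paper):

```python
# Hypothetical: suppose each additional +1% of accuracy costs 10x
# the compute of the previous increment.
compute = 1.0     # arbitrary baseline units
accuracy = 90.0   # starting accuracy, percent (made-up number)
for _ in range(5):
    compute *= 10
    accuracy += 1.0
print(accuracy, compute)  # 95.0 100000.0
```

Five small quality steps, five orders of magnitude of compute: if anything like this holds, power becomes the binding constraint.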