It’s the same as playing a 3d game. It’s a GPU or GPU equivalent doing the work. It doesn’t matter if you are asking it to summarize an email or play Red Dead Redemption.
HubertManne@piefed.social 3 days ago
This is the main reason I am reluctant to use AI. I can get around its functional limitations, but I need to know they have brought the energy usage down.
Blue_Morpho@lemmy.world 3 days ago
HubertManne@piefed.social 3 days ago
I mean if every web search I do is like playing a 3d game then I will stick with web searches. 3d gaming is the most energy intensive thing I do on a computer.
slazer2au@lemmy.world 3 days ago
You can run one on your PC locally so you know how much power it is consuming
HubertManne@piefed.social 3 days ago
I already stress my laptop with what I do, so I doubt I will do that anytime soon. I tend to use pretty old hardware though, 5 years plus. Honestly closer to 10.
IndiBrony@lemmy.world 3 days ago
Can’t remember the last time my hardware was younger than 10 years old 😂…😭
HubertManne@piefed.social 3 days ago
Mine's actually just shy of that. 2017 manufacture.
a_wild_mimic_appears@lemmy.dbzer0.com 3 days ago
How much further down than 3W/request can you go? i hope you don’t let your microwave run 10 seconds longer than optimal, because that’s exactly the amount of energy we are talking about. Or running a 5W nightlight for a bit over half an hour.
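The microwave and nightlight comparisons check out as rough arithmetic; here's a quick sketch (assuming a 1000 W microwave and the 5 W nightlight mentioned above, with 3 Wh per request as the upper bound):

```python
# Back-of-envelope check of the appliance comparisons above.
# Assumed wattages: 1000 W microwave, 5 W nightlight.
request_wh = 3.0        # upper-bound energy per LLM request, in Wh

microwave_w = 1000.0
nightlight_w = 5.0

# seconds of microwave runtime equal to one request
microwave_s = request_wh / microwave_w * 3600
# minutes of nightlight runtime equal to one request
nightlight_min = request_wh / nightlight_w * 60

print(f"{microwave_s:.1f} s of microwave")       # ~10.8 s
print(f"{nightlight_min:.0f} min of nightlight") # 36 min
```

So one request is about ten extra seconds of microwave, or a bit over half an hour of nightlight, matching the comment.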
LLM and Image generation are not what kills the climate. What does are flights, cars, meat, and bad insulation of houses leading to high energy usage in winter. Even if we turned off all GenAI, it wouldn’t even leave a dent compared to those behemoths.
fading_person@lemmy.zip 2 days ago
Where is this 3W from? W isn’t even an energy unit, but a power unit.
a_wild_mimic_appears@lemmy.dbzer0.com 2 days ago
sorry, it should be 3 Wh, you are correct of course. The 3 Wh comes from here:
- Aug 2023: 1.7 - 2.6 Wh - Towards Data Science
- Oct 2023: 2.9 Wh - Alex de Vries
- May 2024: 2.9 Wh - Electric Power Research Institute
- Feb 2025: 0.3 Wh - Epoch AI
- May 2025: 1.9 Wh for a similar model - MIT Technology Review
- June 2025: Sam Altman himself claims it’s about 0.3 Wh
Every source here stays below 3 Wh, so it's reasonable to use 3 Wh as an upper bound. (I wouldn't trust Altman's 0.3 Wh tho lol)
fading_person@lemmy.zip 2 days ago
Thanks for linking the sources. I will take a look into that
HubertManne@piefed.social 2 days ago
sorry, there were two conversations and I'm getting confused. Are you talking local, where I don't have the overhead? Or using them online, where I am worried about the energy usage?
a_wild_mimic_appears@lemmy.dbzer0.com 2 days ago
that’s online, i.e. what is used in the data centers per request. Local is probably not so different, depending on the device - different devices have different architectures which might be more or less optimal, but the cooling is passive.
HubertManne@piefed.social 2 days ago
Yeah, the thing is it's not comparing each request to an airline flight, it's comparing each one to a web search. Its utility is not that much greater; it's just a convenience. It's like with bitcoin, where it's about energy per transaction compared to a credit card transaction. I mean, I search the web every day a whole bunch, and way more when I'm working.
jj4211@lemmy.world 2 days ago
A local LLM probably costs about as much as the online LLM, but that usage is so widely distributed it’s not as impactful to any specific locations.
taiyang@lemmy.world 3 days ago
It’s not that bad when it’s just you fucking around having it write fanfics instead of doing something more taxing, like playing an AAA video game or, idk, running a microwave or whatever it is normies do. Training a model is very taxing, but running one isn’t, and the opportunity cost might even be net positive if you tend to use your GPU a lot.
It becomes more of a problem when everyone is doing it when it’s not needed, like reading and writing emails. There’s no net positive, it’s very large-scale usage, and brains are a hell of a lot more efficient at it. This use case has gotta be one of the dumbest imaginable, all while making people legitimately dumber over time.
HubertManne@piefed.social 3 days ago
oh, you are talking locally I think. I play games on my Steam Deck, as my laptop could not handle it at all.
a_wild_mimic_appears@lemmy.dbzer0.com 3 days ago
Your Steam Deck at full power (15 W TDP per default) equals 5 ChatGPT requests per hour. Do you feel guilty yet? No? And you shouldn’t!
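The 5-requests-per-hour figure follows directly from the 3 Wh upper bound used earlier in the thread:

```python
# Steam Deck default TDP vs. the ~3 Wh per-request upper bound.
deck_tdp_w = 15.0   # Steam Deck default TDP, in watts
request_wh = 3.0    # per-request energy upper bound, in Wh

# one hour of play at full TDP, expressed in equivalent requests
requests_per_hour = deck_tdp_w * 1.0 / request_wh
print(requests_per_hour)  # 5.0
```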
taiyang@lemmy.world 3 days ago
Yup, and the Deck can do stuff at an astoundingly low wattage, like 3 W. Meanwhile there are GPUs that run at like 400-800 W, like when people used to run two 1080s in SLI. I always found it crazy when I saw a guy running a system burning as much electricity as a weak microwave just to play a game, lol. Kept his house warm, tho.
Trainguyrom@reddthat.com 1 day ago
The rule of thumb from data centers is that every watt of compute equals 3 watts of energy consumption: 1 to power the thing and 2 to remove that 1 watt of heat. So high-power components really draw a ton of power.