It’s not that bad when it’s just you fucking around having it write fanfics instead of doing something more taxing, like playing an AAA video game or, idk, running a microwave or whatever it is normies do. Training a model is very taxing, but running one isn’t, and the opportunity cost might even be net positive if you tend to use your GPU a lot anyway.
It becomes more of a problem when everyone is doing it for things that don’t need it, like reading and writing emails. There’s no net positive: it’s very large-scale usage, and brains are a hell of a lot more efficient at it. That use case has got to be one of the dumbest imaginable, and it makes people legitimately dumber over time the more they lean on it.
HubertManne@piefed.social 3 days ago
Oh, you’re talking about running it locally, I think. I play games on my Steam Deck since my laptop couldn’t handle them at all.
a_wild_mimic_appears@lemmy.dbzer0.com 2 days ago
Your Steam Deck at full power (15 W TDP by default) burns the equivalent of about 5 ChatGPT requests per hour. Do you feel guilty yet? No? And you shouldn’t!
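Rough math behind that comparison, as a sketch: it assumes the commonly cited ~3 Wh per ChatGPT request, which is itself only an estimate, and the Deck’s default 15 W power cap.

```python
# Back-of-envelope: one hour of Steam Deck play vs. ChatGPT requests.
# WH_PER_REQUEST is an assumed figure (~3 Wh/request is the commonly cited estimate).
DECK_TDP_W = 15        # Steam Deck APU at its default power cap
WH_PER_REQUEST = 3     # assumed energy per ChatGPT request

deck_wh_per_hour = DECK_TDP_W * 1  # 15 Wh for one hour at full tilt
requests_equivalent = deck_wh_per_hour / WH_PER_REQUEST
print(f"1 hour on the Deck ≈ {requests_equivalent:.0f} ChatGPT requests")  # ≈ 5
```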
taiyang@lemmy.world 3 days ago
Yup, and the Deck can do stuff at an astoundingly low wattage, like 3 W. Meanwhile there are GPU setups that pull like 400-800 W, like back when people ran two 1080s in SLI. I always found it crazy seeing a guy run a system burning as much electricity as a weak microwave just to play a game, lol. Kept his house warm, tho.
Trainguyrom@reddthat.com 1 day ago
The rule of thumb from data centers is that every watt of compute equals about 3 watts of total consumption: 1 to power the thing and 2 to remove the heat that watt produces. So high-power components really are a ton of power.
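A tiny sketch of what that rule of thumb implies; the 3x multiplier here is just the comment’s rule of thumb, not a measured facility PUE, and the 800 W rig is a made-up example:

```python
# Data-center rule of thumb from the comment above:
# every watt of compute draws ~3 W in total (1 W for the part, ~2 W to remove its heat).
OVERHEAD_FACTOR = 3  # assumed multiplier, not a measured value

def facility_watts(compute_watts: float) -> float:
    """Total draw implied by the 3x rule of thumb for a given compute load."""
    return compute_watts * OVERHEAD_FACTOR

print(facility_watts(800))  # a hypothetical 800 W GPU rig -> ~2400 W at the facility
```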