WorldsDumbestMan@lemmy.today 2 weeks ago
Good luck arguing that my local AI models are harming Earth.
Yes, you can argue that phones or lightbulbs themselves harm the Earth. I don’t care at that scale.
Eximius@lemmy.world 2 weeks ago
Are you perchance ignoring the petawatt-hours that were needed to train and distill your local AI model?
WorldsDumbestMan@lemmy.today 2 weeks ago
Yes, it is already developed. Nuclear bombs killed people; that does not mean we should stop all research into fusion.
Me downloading the model does not light up a data center somewhere else. I run it on my local laptop.
Eximius@lemmy.world 2 weeks ago
No. No no no. It is still being developed, with exponentially increasing resources. You downloading the model adds at least one, and probably ten, to the “downloads” and repo-watch counts that CEOs 100% use to validate their insane echo chamber. And you’re literally paying for it all: if you live in the US and they built a data center in your neighborhood, your electricity bill has quadrupled! Or if you ever want to upgrade to DDR5, or ever need more storage space! Or in a myriad of other ways!
Generally, you’re right: they’re just the leftover tools from the gold rush, so why not use them if they’re useful? No point in throwing them away. It’s good that you’re honest with yourself and will never validate the wild amounts of cosmically ironic, cancer-inducing data centers they (I hope only Musk) are operating, by upgrading your local model distilled by your unfavorite AI cloud company that has been negative-profit for five years and is somehow still alive.
kokesh@lemmy.world 2 weeks ago
Oh, you’re right. Everything being plugged into massive datacenters that use more power than whole countries is ok, because your computer somehow doesn’t use much power per query. I stand corrected, sir.
YaxPasaj@lemmy.eco.br 2 weeks ago
Username checks out.
Dremor@lemmy.world 2 weeks ago
A model being local doesn’t make it magically better than cloud-based ones, especially since your local hardware may not be as optimized as professional hardware.
And I say that as a local AI user.
The way to make it less harmful mainly comes from external factors:
The origin of the electricity (renewable)
Shift consumption to a time of low demand (or use a residential battery)
Use specific hardware for specific tasks (use less energy per token)
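To put the “energy per token” point in perspective, here is a back-of-the-envelope sketch. Every number in it is an assumption for illustration (a 65 W laptop sustained draw and 10 tokens/s generation speed are plausible local-inference figures, not measurements):

```python
# Back-of-the-envelope energy-per-token estimate for a local model.
# Both inputs below are assumed values, not measurements.

laptop_power_w = 65.0      # assumed sustained power draw while generating
tokens_per_second = 10.0   # assumed local generation speed

# Power (W) is energy per time (J/s), so dividing by token rate
# gives energy per token.
joules_per_token = laptop_power_w / tokens_per_second
wh_per_token = joules_per_token / 3600.0  # 1 Wh = 3600 J

print(f"{joules_per_token:.1f} J/token "
      f"= {wh_per_token * 1000:.2f} mWh/token")
```

With these assumed inputs, that works out to a few joules per token, which is why measuring your own machine’s draw (and token rate) matters more than arguing in the abstract.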
WorldsDumbestMan@lemmy.today 2 weeks ago
Yeah, I tend to run a bench test on my laptop to heat myself up. It’s really neat!
No, I don’t have a gaming rig. It doesn’t go above 65 W, and I actually have an ideological reason to switch to solar power too.
Soon, my local AIs will be solar powered as well, and offline.
Sadly, so far I only have the capability to charge my tablet and phone, which are actually decent for AI in their own right.
Dremor@lemmy.world 2 weeks ago
Better than a benchmark, you should try Boinc or Folding@Home.
Heating by science.