Comment on There's now more people that complain about AI then there is AI content on Lemmy
univers3man@lemmy.world 1 day ago
Good. Hopefully it will keep AI out of here. The only way I would be cool with AI is if it took 1000% less resources, and wasn’t powered by companies building datacenters that produce power in the most polluting way.
1000% less resources
Trump math is spreading
So DeepSeek run locally for LLMs, or Flux.1 run locally at 1080 for image gen?
The problem I see with that is even though it’s local (which is a huge step towards FOSS ownership instead of private, hidden control), it still takes tons of energy to train the model itself. Not to mention the IP theft, which is a whole 'nother issue.
If a single model uses 1 GW of energy (it doesn’t) being trained but 300 million people use it, then it used 3.33 watts of energy per person, and that’s assuming it was only used a single time.
Using a 900 W toaster 8 times to toast 2 slices of bread uses a bit more, at 3.6 W (0.45 W for 3 minutes of toasting).
So, I’m not sure if that’s true.
Also, you can use a model that didn’t use IP theft, like Mistral for LLMs, or Photoshop for image gen. That is, if you consider the way it trains to be IP theft. But then, you’d be supporting corporations like Adobe.
Photoshop is literally stealing its customers’ images to train its AIs: mashable.com/…/adobe-users-outaged-new-policy-tra…
Watts (and gigawatts) are not a unit of energy. They are a unit of power, or you can think of it as a rate.
900 watts for an hour is 900 watt-hours, or 0.9 kWh. 24 minutes (3 minutes x 8) is 360 Wh, or 0.36 kWh.
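To make that conversion explicit, here's a quick sketch of the arithmetic, using the same toaster figures as above:

```python
# Sketch of the power-vs-energy conversion: energy (Wh) = power (W) x time (h).
toaster_power_w = 900        # toaster power draw, watts
minutes_per_session = 3      # one round of toasting
sessions = 8

hours = (minutes_per_session * sessions) / 60  # 24 minutes -> 0.4 h
energy_wh = toaster_power_w * hours            # 900 W x 0.4 h = 360 Wh
print(f"{energy_wh:.0f} Wh = {energy_wh / 1000:.2f} kWh")  # 360 Wh = 0.36 kWh
```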
All of the major public LLM and diffusion models (ChatGPT, Copilot, Grok, etc.) are absolutely using more than a gigawatt. And I mean constantly. They are trying to create nuclear power plants exclusively to power an AI datacenter. You could math out how much that is per query (not per person), but it’s absolutely insane.
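As a back-of-the-envelope sketch of that per-query math: the 1 GW draw is the figure discussed above, but the query rate is a made-up number purely for illustration, not a real statistic.

```python
# Back-of-the-envelope per-query energy under a constant 1 GW draw.
datacenter_power_w = 1e9              # 1 GW, assumed continuous
assumed_queries_per_second = 30_000   # hypothetical load, for illustration only

joules_per_query = datacenter_power_w / assumed_queries_per_second  # W / (1/s) = J
wh_per_query = joules_per_query / 3600                              # 3600 J per Wh
print(f"~{joules_per_query:,.0f} J (~{wh_per_query:.1f} Wh) per query")
```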
I admit I didn’t know that Mistral or Photoshop claim to use no copyrighted material, but I am loath to support Adobe, as you seem to correctly imply.
On the topic of power usage, we can assume 1 GW. But 300 million people using the same model that was trained via that initial 1 GW input seems like a stretch, given how often OpenAI releases and tweaks models. And the root of my problem with the power draw is that it’s not coming from clean or renewable sources, so it’s not just the 1 GW of usage, but all the pollution that comes with it. Not to mention the datacenters using water for evaporative cooling and taking water from towns.
starlinguk@lemmy.world 1 day ago
AI lies, spreads misinformation and steals. So no, I would not be cool with it even if it took up fewer natural resources.
univers3man@lemmy.world 1 day ago
Ideally, I was thinking nuclear or renewables. But the way things are now, I don’t want it at all. Just wishful thinking on my part.