The thing is: those AI datacenters are used for a lot of things. LLM usage amounts to only about 3% of it; the rest goes to stuff like image analysis, facial recognition, market analysis, recommendation systems for streaming platforms, and so on. And even the water usage is not really the big-ticket item:
The issue of placement of data centers is another discussion, and I agree with you that placing data centers in locations that can't support them is bullshit. But people seem to simply not realize that everything we do has a cost.

The US energy system withdraws about 58 trillion gallons of water each year. ChatGPT uses about 360 million liters/year, which works out to roughly 0.0002% of those withdrawals. An average American household uses about 160 gallons of water per day; a ChatGPT request uses about 20–50 ml. If you want to save water, go vegan or fix water pipes.
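If anyone wants to check the arithmetic, here's a quick back-of-the-envelope sketch in Python. It uses only the figures quoted above (58 trillion gallons of withdrawals, 360 million liters/year for ChatGPT, 160 gallons per household per day, 20–50 ml per request); those numbers are the comment's own estimates, not official data, and the variable names are just for the sketch:

```python
# Back-of-the-envelope check of the water figures quoted above.
# All inputs are the comment's own estimates, not authoritative data.

GALLONS_PER_LITER = 0.264172

us_energy_withdrawals_gal = 58e12   # gallons/year, US energy-sector withdrawals (as quoted)
chatgpt_water_liters = 360e6        # liters/year for ChatGPT (as quoted)
household_gal_per_day = 160         # gallons/day, average US household (as quoted)
per_request_ml = (20, 50)           # ml per ChatGPT request (as quoted)

chatgpt_water_gal = chatgpt_water_liters * GALLONS_PER_LITER
share = chatgpt_water_gal / us_energy_withdrawals_gal

print(f"ChatGPT water use: ~{chatgpt_water_gal / 1e6:.0f} million gallons/year")
print(f"Share of energy-sector withdrawals: ~{share:.4%}")

# How many requests add up to one household-day of water?
for ml in per_request_ml:
    requests = household_gal_per_day / (ml / 1000 * GALLONS_PER_LITER)
    print(f"At {ml} ml/request: ~{requests:,.0f} requests per household-day")
```

At these numbers, one household-day of water covers somewhere in the range of ten to thirty thousand requests.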
Bytemeister@lemmy.world 1 week ago
Well, real quick: my drive to the office is ~10 miles and my car gets ~3.1 miles/kWh, so call it roughly 3 kWh per trip, two trips a day, makes it about 6 kWh. A typical LLM request uses about 0.0015 kWh of electricity, so my daily commute burns roughly 4,000 LLM queries' worth of electricity (rough numbers; quick sketch below).
Yeah, RTO is way worse, even for an EV that gets 91 MPGe.
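Here's that math as a short Python sketch, using the numbers from the comment above (10-mile commute, ~3.1 miles/kWh, ~0.0015 kWh per query); per-query energy estimates vary a lot between models, so treat this as a ballpark only:

```python
# Rough commute-vs-LLM-query energy comparison using the numbers in the comment.
# Inputs are the commenter's own estimates; per-query energy varies widely by model.

commute_miles_one_way = 10
ev_miles_per_kwh = 3.1
kwh_per_llm_query = 0.0015   # "typical LLM request" estimate quoted above

kwh_one_way = commute_miles_one_way / ev_miles_per_kwh   # ~3.2 kWh
kwh_round_trip = 2 * kwh_one_way                         # ~6.5 kWh

print(f"One-way trip: {kwh_one_way:.2f} kWh "
      f"(~{kwh_one_way / kwh_per_llm_query:,.0f} queries)")
print(f"Round trip:   {kwh_round_trip:.2f} kWh "
      f"(~{kwh_round_trip / kwh_per_llm_query:,.0f} queries)")
```

That comes out to roughly 2,000 queries' worth per one-way trip and a bit over 4,000 for the daily round trip.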