boor
@boor@lemmy.world
- Comment on In a first, Google has released data on how much energy an AI prompt uses 3 hours ago:
Thanks for catching that; you are right that the average USA home uses 10.5 MWh/year, not kWh. I was mistaken. :)
Regarding the remainder: my point is that the scale of modern frontier-model training, and the total net-new electricity demand AI is creating, is not trivial. Worrying about traditional sources of CO2 emissions like air travel and so forth is reasonable, but I disagree with the conclusion that AI infrastructure is not a major environmental and climate concern. The latest projects are on the scale of 2-5 GW per site, and the vast majority of that new electricity capacity will come from natural gas or other hydrocarbons.
- Comment on Jimmy Wales Says Wikipedia Could Use AI. Editors Call It the 'Antithesis of Wikipedia' 17 hours ago:
Why would anyone want an editor that doesn’t fact check?
- Comment on 95% of Companies See ‘Zero Return’ on $30 Billion Generative AI Spend, MIT Report Finds 17 hours ago:
Similar experience here. I recently took the official Google “prompting essentials” course. I kept an open mind and modest expectations; this is a tool that’s here to stay. Best to just approach it as the next Microsoft Word and see how it can add practical value.
The biggest thing I learned is that getting quality outputs will require at least a paragraph-long, thoughtful prompt and 15 minutes of iteration. If I can DIY in less than 30 minutes, the LLM is probably not worth the trouble.
I’m still trying to find use cases (I don’t code), but it often just feels like a problem in search of a solution….
- Comment on In a first, Google has released data on how much energy an AI prompt uses 1 day ago:
Google processes over 5 trillion search queries per year. Attaching an AI inference call to most, if not all, of those will increase electricity consumption by at least an order of magnitude.
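A rough annualization of that query volume, using the 0.24 Wh median energy per Gemini text prompt that Google reported (and assuming, as a strong simplification, that every query triggers one such inference call):

```python
# Back-of-envelope annualization. Both inputs are assumptions:
# the query volume from the comment above, and Google's reported
# 0.24 Wh median per Gemini text prompt applied to every query.
QUERIES_PER_YEAR = 5e12   # "over 5 trillion" searches per year
WH_PER_PROMPT = 0.24      # Google's reported median per text prompt

twh_per_year = QUERIES_PER_YEAR * WH_PER_PROMPT / 1e12  # Wh -> TWh
print(f"~{twh_per_year:.1f} TWh/year of inference energy")  # ~1.2 TWh/year
```

That is inference only, at the median per-prompt figure; it excludes training, heavier multimodal prompts, and facility overhead.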
- Comment on In a first, Google has released data on how much energy an AI prompt uses 1 day ago:
AI is the driver of the parabolic spike in global data center buildouts. No other use case comes close in terms of driving new YoY growth in tech infra capex spend.
- Comment on In a first, Google has released data on how much energy an AI prompt uses 1 day ago:
Please show your math.
One Nvidia DGX H100 AI server consumes 10.2 kW at 100% utilization, so a single hour of use is about 10.2 kWh, roughly a third of the daily electricity consumption of a typical US home (~10.5 MWh/year). And this is just a single 8-GPU server; it excludes the electricity required by the networking and storage hardware, let alone the electricity required to run the facility's climate control.
xAI alone has deployed hundreds of thousands of H100 or newer GPUs. Let's SWAG 160K GPUs = ~20K DGX servers = >200 MW for compute alone.
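The SWAG above can be checked directly; all inputs are the figures from these comments, not measured data:

```python
# Back-of-envelope check of the fleet estimate. Every constant here
# is an assumption taken from the comments above.
DGX_H100_KW = 10.2        # one 8-GPU Nvidia DGX H100 at full load
GPUS = 160_000            # SWAG fleet size
GPUS_PER_SERVER = 8
HOME_MWH_PER_YEAR = 10.5  # typical US household consumption

servers = GPUS // GPUS_PER_SERVER        # 20,000 DGX servers
fleet_mw = servers * DGX_H100_KW / 1000  # kW -> MW, compute only
print(f"{servers:,} servers -> ~{fleet_mw:,.0f} MW")  # ~204 MW

# One server-hour compared with household use:
home_kwh_per_day = HOME_MWH_PER_YEAR * 1000 / 365     # ~28.8 kWh/day
share = DGX_H100_KW / home_kwh_per_day                # ~0.35
print(f"one server-hour = {share:.0%} of a home's daily use")
```

Note this still counts only the servers themselves, per the caveat above about networking, storage, and cooling.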
And the H100 is already old; the state-of-the-art GB200 NVL72 draws 120 kW per rack.
Musk is targeting not 160K but literally one million GPUs deployed by the end of this year. He has brought in multiple natural gas power plants, which he is now operating without any environmental permits or controls, to the detriment of the locals in Memphis.
This is just one company training one typical frontier model. There are many competitors operating at similar scale, and sadly the vast majority of their new capacity runs on hydrocarbons, because that's what they can deploy at the scale they need today.