The University of Rhode Island’s AI lab estimates that GPT-5 averages just over 18 Wh per query, so putting all of ChatGPT’s reported 2.5 billion requests a day through the model could see energy usage as high as 45 GWh per day.
A daily energy use of 45 GWh is enormous. A typical modern nuclear reactor produces between 1 and 1.6 GW of electricity, and 45 GWh spread over 24 hours works out to an average draw of about 1.9 GW, so data centers running OpenAI’s GPT-5 at 18 Wh per query could require the output of roughly two nuclear reactors, an amount that could be enough to power a small country.
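The arithmetic behind those figures is easy to sanity-check. A minimal back-of-the-envelope sketch, using only the numbers quoted above (18 Wh per query, 2.5 billion queries a day):

```python
# Back-of-the-envelope check of the GPT-5 energy figures quoted above.
wh_per_query = 18           # URI estimate: Wh per GPT-5 query
queries_per_day = 2.5e9     # ChatGPT's reported daily request volume

daily_wh = wh_per_query * queries_per_day
daily_gwh = daily_wh / 1e9  # 1 GWh = 1e9 Wh
avg_draw_gw = daily_gwh / 24  # average power if spread evenly over 24 h

print(f"{daily_gwh:.0f} GWh/day")           # 45 GWh/day
print(f"{avg_draw_gw:.2f} GW average draw") # 1.88 GW average draw
```

At 1 to 1.6 GW per reactor, a continuous ~1.9 GW draw lands at roughly two reactors' worth of output.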
There’s such a huge gap between what I read about GPT-5 online and the overwhelmingly disappointing results I get from it for both coding and general questions.
I’m beginning to think we’re in the end stages of Dead Internet, where basically nothing you see online has any connection to reality.
Sam_Bass@lemmy.world 7 months ago
And how many milliwatts does an actual brain use?
homesweethomeMrL@lemmy.world 7 months ago
1.21 jigawatts
Sam_Bass@lemmy.world 7 months ago
Ahh but you need 8 for a round trip
Warl0k3@lemmy.world 7 months ago
It’s shockingly tricky to answer precisely, but the commonly held value is that a human brain runs on about 20 W, regardless of the computational load placed on it.
Sam_Bass@lemmy.world 7 months ago
So maybe 0.5 kWh/day. Still vastly more efficient than a code-based one.
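That 0.5 kWh/day checks out, and combining it with the 18 Wh/query figure from the article gives a rough comparison. A small sketch, under the assumption that the 20 W and 18 Wh estimates in the thread are both accurate:

```python
# Brain energy use per day at the commonly cited ~20 W.
brain_w = 20
brain_kwh_per_day = brain_w * 24 / 1000   # 0.48 kWh/day, i.e. "maybe 0.5"

# How many 18 Wh GPT-5 queries fit in one day of brain energy?
wh_per_query = 18
queries_equivalent = brain_kwh_per_day * 1000 / wh_per_query

print(f"{brain_kwh_per_day:.2f} kWh/day")      # 0.48 kWh/day
print(f"~{queries_equivalent:.0f} GPT-5 queries")  # ~27 GPT-5 queries
```

By these numbers, a whole day of brain operation costs about as much energy as 27 GPT-5 queries.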