OpenAI will not disclose GPT-5’s energy use. It could be higher than past models
Submitted 15 hours ago by Davriellelouna@lemmy.world to technology@lemmy.world
https://www.theguardian.com/technology/2025/aug/09/open-ai-chat-gpt5-energy-use
Comments
fuzzywombat@lemmy.world 2 hours ago
Sam Altman has gone into PR and hype overdrive lately. He is practically everywhere, trying to distract the media from seeing the truth about LLMs. GPT-5 has basically proved that we’ve hit a wall, and the belief that LLMs will just scale linearly with the amount of training data is false. He knows the AI bubble is bursting and he is scared.
redsunrise@programming.dev 14 hours ago
Obviously it’s higher. If it were any lower, they would’ve made a huge announcement out of it to prove they’re better than the competition.
Ugurcan@lemmy.world 12 hours ago
I’m thinking otherwise. I think GPT-5 is a much smaller model with some fallback to previous models when needed.
Since it’s running on the exact same hardware with a mostly similar algorithm, using less energy would directly mean it’s a “less intense” model, which translates to “inferior quality” in Americanish.
PostaL@lemmy.world 10 hours ago
And they don’t want to disclose the energy efficiency becaaaause … ?
RobotZap10000@feddit.nl 11 hours ago
They probably wouldn’t really care how efficient it is, but they certainly would care that the costs are low.
ChaoticEntropy@feddit.uk 9 hours ago
I get the distinct impression that most of the focus for GPT5 was making it easier to divert their overflowing volume of queries to less expensive routes.
T156@lemmy.world 4 hours ago
Unless it wasn’t as low as they wanted it to be. It’s at least cheap enough to run that they can afford to drop the pricing on the API compared to their older models.
morrowind@lemmy.ml 4 hours ago
It’s cheaper though, so very likely it’s more efficient somehow.
SonOfAntenora@lemmy.world 3 hours ago
I believe in verifiable statements, and so far, with few exceptions, I’ve seen nothing. We are now speculating on magical numbers that we can’t see, but we know that AI is demanding and we know that even small models are not free. The only accessible data come from Mistral; most other AI devs are not exactly happy to share the inner workings of their tools. Even then, Mistral didn’t release all their data, and even if they had, it would only apply to Mistral 7B and above, not to ChatGPT.
threeduck@aussie.zone 2 hours ago
All the people here chastising LLMs for resource wastage, I swear to god if you aren’t vegan…
k0e3@lemmy.ca 1 hour ago
What a stupid take.
3abas@lemmy.world 21 minutes ago
It’s not, you’re just personally insulted. The livestock industry is responsible for about 15% of human-caused greenhouse gas emissions. That’s not negligible.
stratoscaster@lemmy.world 1 hour ago
What is it with vegans and comparing literally everything to veganism? I was in another thread where it got compared to genocide, rape, and climate change, all in the same thread. Insanity.
UnderpantsWeevil@lemmy.world 29 minutes ago
I mean, they’re both bad.
But also, “Throw that burger in the trash, I’m not eating it” and “Uninstall that plugin, I’m not querying it” have about the same impact on your gross carbon emissions.
These are supply-side problems in industries that receive enormous state subsidies. Hell, the single biggest improvement to our agriculture policy was when China stopped importing US pork products. So, uh… once again, thank you China for saving the planet.
daveB@sh.itjust.works 7 hours ago
aeronmelon@lemmy.world 12 hours ago
Sam Altman looks like an SNL actor impersonating Sam Altman.
ChaoticEntropy@feddit.uk 9 hours ago
“Herr defr, AI. No, seriously.”
dinckelman@lemmy.world 14 hours ago
Duh. Every company like this “suddenly” starts withholding public progress reports once their progress fucking goes downhill. Stop giving these parasites handouts.
Bloomcole@lemmy.world 3 hours ago
I really hate this cunt’s face.
ZILtoid1991@lemmy.world 9 hours ago
When will genAI be so good, it’ll solve its own energy crisis?
xthexder@l.sw0.com 9 hours ago
Most certainly it won’t happen until after AI has developed a self-preservation bias. It’s too bad the solution is turning off the AI.
kescusay@lemmy.world 14 hours ago
I have to test it with Copilot for work. So far, in my experience, its “enhanced capabilities” mostly involve doing things I didn’t ask it to do, extremely quickly. For example, it massively fucked up the CSS in an experimental project when I instructed it to extract a React element into its own file.
That’s literally all I wanted it to do, yet it took it upon itself to make all sorts of changes to styling for the entire application. I ended up reverting all of its changes and extracting the element myself.
Suffice it to say, I will not be recommending GPT-5 going forward.
Sanguine@lemmy.dbzer0.com 13 hours ago
Sounds like you forgot to instruct it to do a good job.
Dindonmasker@sh.itjust.works 11 hours ago
“If you do anything other than what I asked, your mother dies”
GenChadT@programming.dev 14 hours ago
That’s my problem with “AI” in general. It’s seemingly impossible to “engineer” a complete piece of software when using LLMs in any capacity beyond editing a line or two inside individual functions. Too many times I’ve asked GPT/Gemini to make a small change to a file and had to revert it because it took it upon itself to re-engineer the architecture of my entire application.
hisao@ani.social 12 hours ago
I make it write entire functions for me: one prompt = one small feature, or sometimes one or two functions that are part of a feature, or one refactoring. I make manual edits fast and prompt the next step. It easily does things for me like parsing obscure binary formats, threading a new piece of state through the whole application to the levels where it’s needed, or doing massive refactorings. Idk why it works so well for me and so badly for other people, maybe it loves me. I’ve only ever used 4.1 and possibly 4o in free mode in Copilot.
Squizzy@lemmy.world 14 hours ago
We moved to M365 and were encouraged to try the new features. I gave Copilot an Excel sheet and told it to add 5% to each percentage in column B without going over 100%. It spat out jumbled-up data, all reading 6000%.
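For reference, the transformation being asked for is basically a one-liner; here’s a rough sketch of the intent (reading “add 5%” as five percentage points, and assuming the data lives in a pandas DataFrame with a made-up column name):

```python
import pandas as pd

# Hypothetical sheet: column "B" holds percentages as fractions (0.42 == 42%).
df = pd.DataFrame({"B": [0.42, 0.88, 0.97]})

# Add 5 percentage points to each value, capped at 100%.
df["B"] = (df["B"] + 0.05).clip(upper=1.0)

print(df)  # -> 0.47, 0.93, 1.00
```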
Vanilla_PuddinFudge@infosec.pub 12 hours ago
AI assumes too fucking much. I’d used it to set up a new 3D printer with Klipper, to save some searching.
Half the shit it pulled down was Marlin-oriented, and then it had the gall to blame the config it gave me for it, like I wrote it.
“motherfucker, listen here…”
SGforce@lemmy.ca 15 hours ago
It’s the same tech. It would have to be bigger or chew through “reasoning” tokens to beat benchmarks. So yeah, of course it is.
fittedsyllabi@lemmy.world 3 hours ago
But it could also be lower, right?
MentalEdge@sopuli.xyz 15 minutes ago
Unlikely. 5 is just all of OpenAI’s previous models in a trenchcoat.
homesweethomeMrL@lemmy.world 11 hours ago
Photographer1: Sam, could you give us a goofier face?
*click* *click*
Photographer2: Goofier!!
*click* *click* *click* *click*
nialv7@lemmy.world 7 hours ago
Looks like he’s going to eat his microphone
cenzorrll@piefed.ca 9 hours ago
He looks like someone in a cult. Wide-open eyes, thousand-yard stare, not mentally in the same universe as the rest of the world.
cecilkorik@lemmy.ca 13 hours ago
So like, is this whole AI bubble being funded directly by the fossil fuel industry or something? Because AI training and the instantaneous global adoption of these models are using energy like it’s going out of style. Which fossil fuels actually are (going out of style, and being used to power these data centers). Could there be a link? Gotta find a way to burn all the rest of the oil and gas we can get out of the ground before laws make it illegal. Makes sense, in their traditional who-gives-a-fuck-about-the-climate-and-environment sort of way, doesn’t it?
BillyTheKid@lemmy.ca 12 hours ago
I mean, AI is using something like 1-2% of human energy, and that’s fucking wild.
My takeaway is that we need more clean energy generation. Good thing we’ve got countries like China leading the way in nuclear and renewables!!
cecilkorik@lemmy.ca 12 hours ago
All I know is that I’m getting real tired of this Matrix / Idiocracy Mash-up Movie we’re living in.
ayyy@sh.itjust.works 10 hours ago
Yes, China is producing a lot of solar panels (a good thing!) but the percentage of renewables is actually going down. They are adding coal faster than solar.
Womble@piefed.world 10 hours ago
Do you have a source for that? Because given that a ChatGPT query takes a similar amount of energy to running a hair dryer for a few seconds, I find it hard to believe.
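For rough scale, a back-of-envelope sketch of that comparison (the 1,500 W hair-dryer rating, the seven-second run time, and the ~0.3-3 Wh-per-query figures are outside estimates, not anything OpenAI has disclosed):

```python
# Back-of-envelope: a hair dryer running for a few seconds vs. one ChatGPT query.
HAIR_DRYER_WATTS = 1500   # typical consumer hair dryer (assumed)
SECONDS = 7               # "a few seconds" (assumed)

watt_hours = HAIR_DRYER_WATTS * SECONDS / 3600   # convert watt-seconds to Wh
print(f"{watt_hours:.1f} Wh")                    # -> 2.9 Wh

# Published third-party estimates for a single ChatGPT query range from
# roughly 0.3 Wh to about 3 Wh, so the hair-dryer comparison is in the
# right ballpark; none of this is official OpenAI data.
```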
scintilla@crust.piefed.social 12 hours ago
So more energy use for what the people who are into AI are calling a worse model. Is someone going to get fired for this?
vrighter@discuss.tchncs.de 1 hour ago
Is there any picture of the guy without his hand up like that?