Comment on OpenAI will not disclose GPT-5’s energy use. It could be higher than past models
redsunrise@programming.dev 3 weeks ago
Obviously it’s higher. If it was any lower, they would’ve made a huge announcement out of it to prove they’re better than the competition.
ChaoticEntropy@feddit.uk 3 weeks ago
I get the distinct impression that most of the focus for GPT5 was making it easier to divert their overflowing volume of queries to less expensive routes.
morrowind@lemmy.ml 3 weeks ago
It’s cheaper though, so very likely it’s more efficient somehow.
SonOfAntenora@lemmy.world 3 weeks ago
I believe in verifiable statements, and so far, with few exceptions, I’ve seen nothing. We are now speculating on magical numbers that we can’t see, but we know that AI is demanding and that even small models are not free. The only accessible data come from Mistral; most other AI devs are not exactly happy to share the inner workings of their tools. Even then, Mistral didn’t release all their data, and even if they had, it would only apply to Mistral 7B and above, not to ChatGPT.
Sl00k@programming.dev 3 weeks ago
“The only accessible data come from Mistral; most other AI devs are not exactly happy to share the inner workings of their tools.” Important to point out this is really only valid for Western AI companies. Chinese AI models have mostly been open source, with open papers.
T156@lemmy.world 3 weeks ago
Unless it wasn’t as low as they wanted it. It’s at least cheap enough to run that they can afford to drop the pricing on the API compared to their older models.
thatcrow@ttrpg.network 3 weeks ago
It warms me heart to see y’all finally tune in to the scumbag tactics our abusers constantly employ.
Ugurcan@lemmy.world 3 weeks ago
I’m thinking otherwise. I think GPT5 is a much smaller model with some fallback to previous models when needed.
Since it’s running on the exact same hardware with a mostly similar algorithm, using less energy would directly mean it’s a “less intense” model, which translates to “inferior quality” in Americanish.
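If that is what’s going on, a rough sketch of the idea might look something like this (the model names, difficulty heuristic, and threshold are all invented for illustration; nothing here reflects OpenAI’s actual implementation):

```python
# Purely illustrative "route cheap, fall back when needed" sketch.
# Model names, the heuristic, and the 0.5 threshold are made up.

def estimate_difficulty(prompt: str) -> float:
    """Crude stand-in for a learned router: long or reasoning-heavy prompts score higher."""
    keywords = ("prove", "derive", "debug", "optimize", "step by step")
    score = min(len(prompt) / 2000, 1.0)
    score += 0.3 * sum(kw in prompt.lower() for kw in keywords)
    return min(score, 1.0)

def route(prompt: str) -> str:
    """Send easy queries to a cheap model; reserve the big one for hard queries."""
    if estimate_difficulty(prompt) < 0.5:
        return "small-cheap-model"      # most traffic lands here, keeping average cost/energy low
    return "large-expensive-model"      # fallback for queries the small model likely can't handle

print(route("What is the capital of France?"))                         # -> small-cheap-model
print(route("Prove the convergence step by step and debug my code"))   # -> large-expensive-model
```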
PostaL@lemmy.world 3 weeks ago
And they don’t want to disclose the energy efficiency becaaaause … ?
AnarchistArtificer@slrpnk.net 3 weeks ago
Because the AI industry is a bubble that exists to sell more GPUs and drive fossil fuel demand
Hobo@lemmy.world 3 weeks ago
Because, uhhh, whoa what’s that?
ducks behind the podium
RobotZap10000@feddit.nl 3 weeks ago
They probably wouldn’t really care how efficient it is, but they certainly would care that the costs are low.
Ugurcan@lemmy.world 3 weeks ago
I’m almost sure they’re keeping that for the Earnings call.
panda_abyss@lemmy.ca 3 weeks ago
Do they do earnings calls? They’re not public.
Sl00k@programming.dev 3 weeks ago
It also has a very flexible “thinking” nature, which means far fewer tokens spent on most people’s responses.
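For a sense of why that would matter for energy, here’s a toy calculation under the assumption that consumption scales roughly with generated tokens; every figure below is a made-up placeholder, not a measurement of GPT-5 or any real model:

```python
# Toy arithmetic for the "fewer thinking tokens" point above.
# The per-token energy figure and token counts are invented placeholders.

ENERGY_PER_TOKEN_WH = 0.003  # hypothetical watt-hours per generated token

def query_energy_wh(thinking_tokens: int, answer_tokens: int) -> float:
    """Energy estimate assuming consumption scales roughly linearly with generated tokens."""
    return (thinking_tokens + answer_tokens) * ENERGY_PER_TOKEN_WH

always_reasoning = query_energy_wh(thinking_tokens=2000, answer_tokens=300)
adaptive = query_energy_wh(thinking_tokens=0, answer_tokens=300)  # easy query, thinking skipped

print(f"always reasoning: {always_reasoning:.2f} Wh, adaptive: {adaptive:.2f} Wh")
# With these made-up numbers, skipping the reasoning pass on easy queries
# cuts per-query energy from 6.90 Wh to 0.90 Wh (~87% less).
```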