Isn't it just worse than 4 tho? If they didn't make it cheaper, nobody would pay…
eager_eagle@lemmy.world 3 days ago
Bit of clickbait. We can't really say that without more info.
But it’s important to point out that the lab’s test methodology is far from ideal.
The team measured GPT-5’s power consumption by combining two key factors: how long the model took to respond to a given request, and the estimated average power draw of the hardware running it.
What we do know is that the price went down. So this could be a strong indication the model is, in fact, more energy efficient. At least a stronger indicator than response time.
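For a rough sense of the methodology quoted above, here is a minimal sketch of that kind of time-based estimate. The function name and all numbers are hypothetical, not the lab's actual figures; the idea is just that energy is approximated as wall-clock response time multiplied by an assumed average power draw.

```python
# Sketch of the time-based estimate described in the quoted methodology.
# All numbers are made up for illustration; they are not the lab's figures.

def estimate_energy_wh(response_time_s: float, avg_power_draw_w: float) -> float:
    """Energy (Wh) ~= wall-clock response time (s) * assumed average power draw (W)."""
    return response_time_s * avg_power_draw_w / 3600.0

# Example: a 20 s response on hardware assumed to draw 1 kW on average
print(estimate_energy_wh(response_time_s=20, avg_power_draw_w=1000))  # ~5.6 Wh
```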
unexposedhazard@discuss.tchncs.de 3 days ago
T156@lemmy.world 1 day ago
They could have kept it at the same price, though.
morrowind@lemmy.ml 3 days ago
That’s a terrible metric. By that logic, providers that maximize hardware (and energy) utilization by queuing up requests would appear to use more energy per request, simply because queuing makes individual responses slower.
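A small hypothetical comparison of how queuing/batching skews that kind of estimate (all numbers invented): wall-clock time per request goes up, so the time-times-power proxy rises, even though the energy actually attributable to each request falls because the hardware is shared.

```python
# Hypothetical comparison: dedicated vs. batched serving on the same 1 kW box.
# Numbers are invented purely to show how the proxy diverges from per-request energy.

POWER_W = 1000.0

def proxy_wh(wall_time_s: float) -> float:
    # Time-based estimate: wall-clock time * average power draw
    return wall_time_s * POWER_W / 3600.0

def per_request_wh(wall_time_s: float, concurrent_requests: int) -> float:
    # If the hardware serves N requests at once, the draw is shared among them
    return wall_time_s * POWER_W / concurrent_requests / 3600.0

# Dedicated: 10 s per request, GPU serving only that request
print(proxy_wh(10), per_request_wh(10, 1))   # ~2.78 Wh vs ~2.78 Wh

# Queued/batched: 15 s wall time per request, but 8 requests share the GPU
print(proxy_wh(15), per_request_wh(15, 8))   # ~4.17 Wh vs ~0.52 Wh
```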