Comment on ChatGPT is down worldwide, conversations disappeared for users
pahlimur@lemmy.world 23 hours ago
It’s not easy to solve because it’s not possible to solve. ML has been around since before computers; it’s not magically going to get efficient. The models are already optimized.
Revenue isn’t profit. These companies are the biggest cost sinks ever.
Heating a single building is a joke marketing tactic compared to the actual energy impact these LLM energy sinks have.
I’m an automation engineer, and LLMs suck at anything cutting edge. An LLM is basically a mainstream knowledge reproducer with no original outputs, meaning it can’t do anything that isn’t already done.
NotMyOldRedditName@lemmy.world 23 hours ago
Why on earth do you think things can’t be optimized on the LLM level?
There are constant improvements being made there; they are not in any way, shape, or form fully optimized yet. Go follow the /r/LocalLlama sub, for example: there are constant breakthroughs happening, and a few months later you see an LLM utilizing them come out, and it’s suddenly smaller, or you can run a large model in a smaller memory footprint, or you can fit a larger context on the same hardware, etc.
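The memory-footprint point above can be sketched with simple arithmetic. One of the main tricks behind "run a large model on smaller hardware" is quantization: storing each weight in fewer bits. The model size below is a hypothetical example, not a benchmark of any real model:

```python
# Rough memory needed just to hold a model's weights at different
# precisions. Ignores KV cache, activations, and runtime overhead,
# so real requirements are somewhat higher.
def weight_memory_gb(n_params: float, bits_per_weight: int) -> float:
    """GB required for the weights alone."""
    return n_params * bits_per_weight / 8 / 1e9

n = 7e9  # a hypothetical 7-billion-parameter model
for bits in (16, 8, 4):
    print(f"{bits}-bit: {weight_memory_gb(n, bits):.1f} GB")
# 16-bit: 14.0 GB, 8-bit: 7.0 GB, 4-bit: 3.5 GB
```

This is why a model that needs a datacenter GPU at 16-bit precision can fit on a consumer card once quantized to 4-bit, at some cost in accuracy.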
This is all so fucking early; thinking they’re as optimized as they can get is hilariously naive.
pahlimur@lemmy.world 23 hours ago
I’ll take a step back. These LLM models are interesting. They are being trained in interesting new ways. They are becoming more ‘accurate’, I guess. ‘Accuracy’ is very subjective and can be manipulated.
Machine learning is still the same though.
LLMs will still never expand beyond their inputs.
My point is it’s not early anymore. We are near or past the peak of LLM development. The extreme amount of resources being thrown at it is the sign that we are near the end.
That sub should not be used to justify anything, just like any subreddit at any point in time.
NotMyOldRedditName@lemmy.world 22 hours ago
I think we’re just going to have to agree to disagree on this part.
I’ll agree though that IF what you’re saying is true, then they won’t succeed.
pahlimur@lemmy.world 22 hours ago
Fair enough. I’d be fine being wrong.
Improved efficiency would reduce the catastrophic energy demands LLMs will have in the future. Assuming your reality comes true, it would help reduce their environmental impact.
We’ll see. This isn’t the first “it’s the future” technology I’ve seen, and I’m barely 40.