Comment on ChatGPT is down worldwide, conversations disappeared for users

NotMyOldRedditName@lemmy.world 23 hours ago

Why on earth do you think things can’t be optimized at the LLM level?

There are constant improvements being made there; they are not in any way, shape, or form fully optimized yet. Go follow the /r/LocalLlama sub, for example: there are constant breakthroughs happening, and a few months later you see an LLM come out that uses them. Suddenly models are smaller, or you can run a large model in a smaller memory footprint, or you can fit a larger context on the same hardware, etc.
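The memory-footprint point is easy to see with back-of-the-envelope arithmetic. A minimal sketch (illustrative numbers only; it counts weights alone and ignores KV cache, activations, and per-layer overhead):

```python
# Rough memory footprint of a model's weights at different quantization
# levels -- illustrative arithmetic only, not a real loader.

def weights_gb(n_params: float, bits_per_weight: float) -> float:
    """Gigabytes needed just to hold the weights."""
    return n_params * bits_per_weight / 8 / 1e9

for name, bits in [("fp16", 16), ("int8", 8), ("int4", 4)]:
    print(f"7B model @ {name}: {weights_gb(7e9, bits):.1f} GB")
# fp16: 14.0 GB, int8: 7.0 GB, int4: 3.5 GB
```

That drop from 14 GB to 3.5 GB is why quantization breakthroughs keep putting bigger models within reach of consumer hardware.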

This is all so fucking early. To be naive enough to think they’re as optimized as they can get is hilarious.
