Comment on ChatGPT is down worldwide, conversations disappeared for users

NotMyOldRedditName@lemmy.world 23 hours ago

> rely on input data, which is running out.

That's part of the equation, but there's still a lot of work that can be done to optimize the LLMs themselves. The more optimized and refined they are, the cheaper they become to run, and you can also use even bigger datasets that weren't feasible before.

I think there's also a lot of room to optimize the data in the dataset. Ingesting the entire world's information doesn't lead to the best output, especially for something more factual like an LLM trained to assist with programming in a specific language.

And people ARE paying for it today; OpenAI has billions in revenue. The problem is that the hardware is so expensive, and the data centers needed to run it are also expensive. They need to keep optimizing to narrow that gap. OpenAI charges $20 USD/month for their base paid plan and has millions of paying customers, but millions isn't enough to offset their costs.
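The revenue-versus-cost gap can be sketched with back-of-envelope arithmetic. Only the $20/month price comes from the comment; the subscriber count and per-user serving cost below are hypothetical placeholders, not OpenAI's real figures.

```python
# Back-of-envelope sketch of the subscription gap described above.
# Only the $20/month price is from the comment; the subscriber count
# and cost-per-user are made-up placeholders for illustration.

def monthly_gap(subscribers: int, price: float, cost_per_user: float) -> float:
    """Subscription revenue minus serving cost, in dollars per month."""
    return subscribers * price - subscribers * cost_per_user

# Hypothetical: 10M paying users at $20/month, costing $30/month each to serve.
gap = monthly_gap(10_000_000, 20.0, 30.0)
print(f"monthly gap: ${gap:,.0f}")  # a negative gap means costs exceed revenue
```

With these assumed numbers the gap is -$100M/month, which is why the two levers below (cut serving cost, or grow paying users) both matter.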

So they can

  1. reduce costs so that millions of customers is enough, or
  2. make it more useful so they gain more users.

This is so early that they have room to improve on both 1 and 2.

But like I said, they (and others like them) need to figure that out before they run out of money, everything falls apart, and it has to be built back up in a more sustainable way.
