It is unlikely to turn a profit: the returns would have to exceed the massive investment, and the trends show that very few people want to pay for this service. I mean, why would you pay for something that’s the equivalent of asking someone online or in person, for free or at very little cost by comparison?
Furthermore, it’s a corporation that steals from you and doesn’t want to be held accountable for anything. Look at the chatbot-linked suicides, or the fact that their business model would collapse if they actually had to pay for the data they use to train their models.
The whole thing is extremely inefficient and makes us dumber through atrophy. Why would anyone want to outsource their thinking process to a third party? It’s like assuming everyone wants a mobility scooter.
pahlimur@lemmy.world 23 hours ago
The current machine learning models (“AI” for the stupid) rely on training data, which is running out.
Processing power per watt is stagnating. Moore’s law hasn’t been true for years.
Who will pay for these services? The dot-com bubble destroyed everyone who invested in it; those that “survived” sprouted off the corpse of that recession. LLMs will probably survive, but not in the way you assume.
Nvidia helping OpenAI survive is a sign that the bubble is here and ready to blow.
NotMyOldRedditName@lemmy.world 23 hours ago
That’s part of the equation, but there is still a lot of work that can be done to optimize the usage of the LLMs themselves. The more optimized and refined they are, the cheaper they become to run, and you can also use even bigger datasets that weren’t feasible before.
I think there’s also a lot of room to optimize the data in the dataset itself. Ingesting the entire world’s information doesn’t lead to the best output, especially if you’re going for something more factual, like an LLM trained to assist with programming in a specific language.
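A toy sketch of what that kind of curation could look like (the document format, tags, and filter here are all made up for illustration, not any real training pipeline):

```python
# Toy example of curating a training corpus down to one domain:
# keep only documents tagged as code in the target language, drop the rest.
# The document format and tags are invented for this illustration.

raw_corpus = [
    {"text": "def add(a, b): return a + b", "tag": "python"},
    {"text": "Celebrity gossip article...", "tag": "news"},
    {"text": "for i in range(10): print(i)", "tag": "python"},
]

curated = [doc for doc in raw_corpus if doc["tag"] == "python"]

print(f"Kept {len(curated)} of {len(raw_corpus)} documents for training")
```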
And people ARE paying for it today; OpenAI has billions in revenue. The problem is that the hardware is so expensive, and the data centers needed to run it are also expensive, so they need to keep optimizing to narrow that gap. OpenAI charges $20 USD/month for their base paid plan and has millions of paying customers, but millions isn’t enough to offset their costs.
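To make that gap concrete, here’s a back-of-envelope sketch. The subscriber count and infrastructure figure are assumptions for illustration, not OpenAI’s actual numbers:

```python
# Back-of-envelope: subscription revenue vs. infrastructure spend.
# All figures below are illustrative assumptions, not OpenAI's real numbers.

subscribers = 10_000_000            # "millions of paying customers" (assumed)
price_per_month = 20                # USD, base paid plan
annual_revenue = subscribers * price_per_month * 12

annual_infra_spend = 7_000_000_000  # assumed hardware/data-center cost, USD

print(f"Subscription revenue: ${annual_revenue:,}/yr")  # $2,400,000,000/yr
print(f"Infrastructure spend: ${annual_infra_spend:,}/yr")
print(f"Gap to close:         ${annual_infra_spend - annual_revenue:,}/yr")
```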
So they can work on both sides of that gap: what it costs to run the models and what people pay for them. This is so early that they have room to improve both.
But like I said, they (and others like them) need to figure that out before they run out of money and everything falls apart and needs to be built back up in a more sustainable way.
pahlimur@lemmy.world 23 hours ago
None of this is true.
I’ve worked on data centers monitoring power consumption, and we need to stop calling LLM power sinks the same thing as data centers. It’s basically whitewashing the power-sucking environmental disasters that they are.
Machine learning is what you are describing. LLMs being puppeted as AI is destructive marketing and nothing more.
LLMs are somewhat useful for dumb tasks, and they do a pretty dumb job at them. They feel like me when I was new at my job: I could have kept producing mediocre bullshit for decades, but I was too naive to know it sucked. You can’t see how much they suck yet because you lack experience in the areas you use them in.
Your two cost-saving points are pulled from nowhere, just like the output of LLM inference.
NotMyOldRedditName@lemmy.world 23 hours ago
lol
leftist_lawyer@lemmy.today 16 hours ago
Oh ffs … stfu.