Comment on ChatGPT is down worldwide, conversations disappeared for users
redwattlebird@lemmings.world 23 hours ago
It is unlikely to turn a profit, because for there to be any profit the returns have to exceed the investment, and the trends show that very few people want to pay for this service. I mean, why would you pay for something that’s the equivalent of asking someone online or in person, for free or at very little cost by comparison?
Furthermore, it’s a corporation that steals from you and doesn’t want to be held accountable for anything. For example: the suicides linked to its chatbot, and the fact that its business model would fall over if it actually had to pay for the data it uses to train its models.
The whole thing is extremely inefficient and makes us dumber through atrophy. Why would anyone want to outsource their thinking to a third party? It’s like assuming everyone wants a mobility scooter.
NotMyOldRedditName@lemmy.world 23 hours ago
These companies have BILLIONS in revenue and millions of customers, and you’re saying very few want to pay…
The money is there. They just need to optimize the LLMs to run more efficiently (this is continually progressing), and on the hardware side keep driving costs down as well (including electricity usage and heat generation). If OpenAI can build a datacenter that, say, re-uses all of its heat to warm a nearby hospital, that’s another step towards reaching profitability.
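To put rough numbers on the heat idea (every figure below is an assumption for illustration, not anything OpenAI has published), here’s a back-of-envelope sketch of what selling waste heat could plausibly offset:

```python
# Back-of-envelope sketch: value of waste-heat reuse for a hypothetical data centre.
# All figures are assumptions for illustration only.

dc_power_mw = 50                    # assumed average IT load, in megawatts
hours_per_year = 8760
heat_recovery_fraction = 0.7        # assumed share of input power recoverable as useful heat
heat_price_eur_per_mwh = 30         # assumed price a district-heating network pays for that heat
electricity_price_eur_per_mwh = 90  # assumed industrial electricity price

recovered_heat_mwh = dc_power_mw * hours_per_year * heat_recovery_fraction
heat_revenue_eur = recovered_heat_mwh * heat_price_eur_per_mwh
power_bill_eur = dc_power_mw * hours_per_year * electricity_price_eur_per_mwh

print(f"Recovered heat:  {recovered_heat_mwh:,.0f} MWh/year")
print(f"Heat revenue:   ~€{heat_revenue_eur / 1e6:.1f}M/year")
print(f"Power bill:     ~€{power_bill_eur / 1e6:.1f}M/year")
print(f"Heat sales offset ~{100 * heat_revenue_eur / power_bill_eur:.0f}% of the power bill")
```

Under those assumptions heat sales offset only a fraction of the power bill, so it’s a cost reducer rather than a profit engine, but it isn’t nothing either.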
pahlimur@lemmy.world 22 hours ago
It’s not easy to solve because it’s not possible to solve. The math behind machine learning has been around since before computers; it isn’t magically going to get efficient. The models are already optimized.
Revenue isn’t profit. These companies are the biggest cost sinks ever.
Heating a single building is a joke marketing tactic compared to the actual energy impact these LLM energy sinks have.
I’m an automation engineer, and LLMs suck at anything cutting edge. They’re basically mainstream-knowledge reproducers with no original output, meaning they can’t do anything that isn’t already being done.
NotMyOldRedditName@lemmy.world 22 hours ago
Why on earth do you think things can’t be optimized on the LLM level?
There are constant improvements being made there; they are not in any way, shape or form fully optimized yet. Follow the /r/LocalLlama sub, for example: there are constant breakthroughs, and a few months later an LLM utilizing them comes out and it’s suddenly smaller, or you can run a large model in a smaller memory footprint, or get a larger context on the same hardware, and so on.
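Just to make the memory-footprint point concrete (the model size and bit-widths here are a generic illustration, not any particular release), quantization alone keeps shrinking what it takes to serve the same weights:

```python
# Rough illustration of why quantization keeps shrinking memory requirements.
# A hypothetical 70B-parameter model is used as the example; weights only,
# ignoring KV cache and runtime overhead.

def weight_memory_gb(params_billion: float, bits_per_weight: int) -> float:
    """Approximate memory needed to hold the model weights, in gigabytes."""
    total_bytes = params_billion * 1e9 * bits_per_weight / 8
    return total_bytes / 1e9

model_size_b = 70
for label, bits in [("FP16", 16), ("8-bit", 8), ("4-bit", 4)]:
    print(f"{model_size_b}B weights at {label}: ~{weight_memory_gb(model_size_b, bits):.0f} GB")

# FP16  -> ~140 GB (multi-GPU territory)
# 8-bit ->  ~70 GB
# 4-bit ->  ~35 GB (within reach of a couple of consumer cards)
```

That’s a 4x reduction from one technique, before counting attention, KV-cache and kernel-level optimizations.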
This is all so fucking early. To be so naive as to think they’re as optimized as they can get is hilarious.
pahlimur@lemmy.world 22 hours ago
I’ll take a step back. These LLMs are interesting. They are being trained in interesting new ways. They are becoming more ‘accurate’, I guess, though ‘accuracy’ is very subjective and can be manipulated.
Machine learning is still the same underneath, though.
LLMs will still never expand beyond their training inputs.
My point is that it’s not early anymore. We are near or past the peak of LLM development, and the extreme amount of resources being thrown at it is a sign that we are near the end.
That sub should not be used to justify anything, just like any subreddit at any point in time.
leftist_lawyer@lemmy.today 16 hours ago
Pretty obvious you’re not reading Zitron et al.
redwattlebird@lemmings.world 22 hours ago
Yep, I am. Just follow the money. Here’s an example:
https://www.theregister.com/2025/10/29/microsoft_earnings_q1_26_openai_loss/
… That’s all in your head, mate. I never said that nor did I imply it.
What I am implying is that the uptake is so small compared to the investment that it is unlikely to turn a profit.
😐
I’ve worked in the building industry for over 20 years. This is simply not feasible, from both a materials standpoint and a physics standpoint.
I know it’s just an example, but this kind of rhetoric is exactly the sort of wishful thinking I see from so many people who want LLMs to be a staple of our everyday lives. Scratch the surface and it’s all just fantasy.
NotMyOldRedditName@lemmy.world 22 hours ago
You > the trends show that very few want to pay for this service.
Me > These companies have BILLIONS in revenue and millions of customers, and you’re saying very few want to pay
Me > … but you’re making it sound no one wants it
You > … That’s all in your head, mate. I never said that nor did I imply it.
Pretty sure it’s not all in my head.
The heat example was just one small example of what these large data centers (not just AI ones) can do to help lower costs, and it is a real thing being considered. It’s not a solution to their power-hungry needs, but it is a small step towards doing things better.
www.bbc.com/news/articles/cew4080092eo
redwattlebird@lemmings.world 19 hours ago
And how, pray tell, will doing all of that return a profit?
I’m from Australia, so I can only speak to the Australian climate and industry, and I can confidently say that the model shown in Vienna is not feasible here. We simply don’t have much use for excess heat, and we are highly susceptible to droughts. DCs use a lot of water for cooling, and building these all over the country for private enterprise is bonkers. So that’s instantly a market that isn’t profitable. Furthermore, it’s not feasible to build a pipeline and re-route the heat across large distances with minimal heat loss.
However, even when or if they implement this throughout all of Austria, it won’t return a profit (which is what I thought your link was about here, not feasibility. We are talking about profitability, right?). The project cost €3.5m and was partially tax-funded. It’s not a great example of profitability, but it is a good example of a sustainability measure.
leftist_lawyer@lemmy.today 16 hours ago
“Just need to optimize.” As if they haven’t been trying for years already with, again, incredibly marginal returns that continue to diminish with each optimization.
Derp.