ChatGPT development, I feel it is completely possible
Submitted 1 year ago by balfrag@lemmy.world to technology@lemmy.world
Repost from HN: news.ycombinator.com/item?id=37109394
LLMs are pricey to train and evaluate, much more so than compositional models.
But no, OpenAI aren’t going bust due to this. Given that they have the most successful LLM on the market, it’s safe to say that they probably know how much their models cost to run, and can calculate roughly how much their yearly spend will be.
You make a good point.
But maybe they asked their AI model and it misplaced a decimal. /s
The thing about all GPT models is that they’re based on word frequency to determine a word’s usage, which means the only way to get good results is to run them on cutting-edge hardware designed specifically for that job, with the model itself weighing in at almost a TB. Meanwhile, diffusion models are only a few GB, run on a GPU, and still produce masterpieces because they already know what each word is associated with.
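To illustrate the “predict the next word by frequency” idea that comment is gesturing at, here is a toy Python sketch of a bigram counter. It is not how GPT actually works (GPT uses a trained neural network over tokens, not raw counts), and the tiny corpus and function names are made up purely for this example.

```python
from collections import defaultdict, Counter
import random

# Toy "next word by frequency" model: count which word follows which
# in a small corpus, then sample the next word in proportion to those
# counts. This is a bigram sketch for illustration, not GPT's actual
# mechanism.
corpus = "the cat sat on the mat the cat ate the fish".split()

follow_counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follow_counts[prev][nxt] += 1

def next_word(prev):
    """Sample the next word weighted by how often it followed `prev`."""
    counts = follow_counts[prev]
    if not counts:                      # dead end: nothing ever followed this word
        return None
    words, weights = zip(*counts.items())
    return random.choices(words, weights=weights, k=1)[0]

# Generate a short continuation starting from "the".
word = "the"
out = [word]
for _ in range(5):
    word = next_word(word)
    if word is None:
        break
    out.append(word)
print(" ".join(out))
```

The real models replace that count table with billions of learned parameters, which is roughly where the hardware and model-size complaints above come from.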
Whatever, it was always some fly-by-night operation anyway. It’s a cool toy, but all this revolutionary crap talk was just like when NFTs showed up.
Well, I mean, ChatGPT actually does have some real-world use. Personally, I find it more helpful than Stack Overflow when it comes to finding problems with my code.
Oh no… anyway
Sounds like poor management. I don't see how that is anyone else's problem though.
Not while Microsoft keeps pumping them with money.
Anamana@feddit.de 1 year ago
It would help if they offered more payment options than just credit cards, which aren’t really popular in many countries.