Comment on Stack Overflow in freefall: 78 percent drop in number of questions
ramble81@lemmy.zip 3 weeks ago
Serious question here. LLMs were trained on data from SO. Developers now ask LLMs for solutions instead of SO. New technology comes out that LLMs haven’t indexed. Where will LLMs get their data to train on for new technologies? You can’t exactly feed them a manual and expect them to extrapolate or understand (for that matter, what manual?).
dantheclamman@lemmy.world 3 weeks ago
I am worried, because there are increasing cases of open source docs going offline because the projects can’t afford the bandwidth costs of the big LLM bots recrawling them hundreds of times per day. Wikipedia is also getting hammered. There is so much waste for diminishing returns.
General_Effort@lemmy.world 3 weeks ago
You can’t exactly feed it a manual and expect it to extrapolate or understand (for that matter, what manual?).
You can do that to a degree (RLVR, reinforcement learning with verifiable rewards). They are also paying human experts. But that’s the situation now. Who knows how it will be in a couple more years. Maybe training AIs will be like writing a library, framework, …
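For readers unfamiliar with RLVR: the core idea is that instead of learning from human-written Q&A, the model samples its own answers and only the ones that pass a deterministic, automatically checkable test earn reward. A minimal toy sketch, with a stand-in `propose` function playing the role of the language model and a trivial arithmetic task as the verifiable problem (all names and the task itself are illustrative, not any real training pipeline):

```python
import random

def propose(prompt):
    """Stand-in for sampling an answer from a language model.
    Usually right, occasionally off by one."""
    a, b = prompt
    return a + b + random.choice([-1, 0, 0, 0, 1])

def verify(prompt, answer):
    """Verifiable reward: 1.0 if the answer passes a deterministic
    check (here, exact arithmetic), 0.0 otherwise. No human labels."""
    a, b = prompt
    return 1.0 if answer == a + b else 0.0

def rlvr_step(prompts, n_samples=8):
    """Collect (prompt, answer, reward) triples; a real pipeline would
    feed these into a policy-gradient update of the model."""
    batch = []
    for p in prompts:
        for _ in range(n_samples):
            ans = propose(p)
            batch.append((p, ans, verify(p, ans)))
    return batch

random.seed(0)
batch = rlvr_step([(2, 3), (10, 7)])
mean_reward = sum(r for _, _, r in batch) / len(batch)
print(f"mean reward: {mean_reward:.2f}")
```

The point of the sketch: the reward comes from a checker, not from scraped human answers, which is why it only works for domains where answers can be verified automatically (math, code that passes tests), not for open-ended questions.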
ThomasWilliams@lemmy.world 3 weeks ago
From the questions people ask and from online accounts.
Prox@lemmy.world 3 weeks ago
Yes, that is the major problem with LLMs in general. There is no solution aside from “train on another source (like Reddit)”, but then we rinse and repeat.