Comment on Google’s dominance on search is declining – for the first time ever!
FinnFooted@lemmy.world 12 hours ago
I get the desire to say this, but I find them extremely helpful in my line of work. Literally everything they say needs to be validated, but so does Wikipedia, and we all know that Wikipedia is extremely useful. It's just another tool, but it's a very useful tool if you know how to apply it.
taladar@sh.itjust.works 11 hours ago
But Wikipedia is basically correct 99% of the time on basic facts if you look at non-controversial topics where nobody has an incentive to manipulate it. LLMs, meanwhile, are lucky if 20% of what they say has any relationship to reality. Not just complex facts either; I wouldn't be surprised if an LLM got wrong how many hands a human being has.
FinnFooted@lemmy.world 2 hours ago
LLMs with access to the internet are usually about as factually correct as their search results. If it pulls from someone's blog, you're right, the results will suck. But if you tell it to use higher-quality sources, it returns better information. They're good if you know how to use them. And they aren't good enough to replace as many jobs as all these companies are hoping; LLMs are just going to speed up productivity. They need babysitting and validating, but they're still an extremely useful tool that's only going to get better, and LLMs are here to stay.
taladar@sh.itjust.works 2 hours ago
That is the thing: they are not "only going to get better", because training has hit a wall, and the compute used will have to be reduced since they currently lose money on every request.
FinnFooted@lemmy.world 2 hours ago
Tech companies these days always lose money at the start. It's a really stupid feature of modern startups, IMO: get people dependent, then make money later. I don't agree with it. I don't really think our entire economic system is viable anyway, but that's another conversation.
But LLMs have been improving exponentially. Just a year ago I was on board with everything you're saying, about how they suck and how they're even going to hit a wall. But they don't need more training data or more processing power. They have those, and now they're refining the LLMs. I have a local LLM on my computer that performs better than ChatGPT did a year ago, and it's only a few GB. I run it on a shitty laptop.
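For anyone curious what "a few GB on a shitty laptop" looks like in practice, here's a minimal sketch using llama-cpp-python. The model file name and parameters are just examples of a small quantized model, not the exact setup described above.

```python
# A minimal sketch of running a small quantized model locally with
# llama-cpp-python. The model file and parameters are hypothetical examples.
from llama_cpp import Llama

llm = Llama(
    model_path="models/llama-3.2-3b-instruct.Q4_K_M.gguf",  # example ~2 GB quantized file
    n_ctx=2048,  # modest context window to keep RAM usage laptop-friendly
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "How many hands does a human have?"}],
    max_tokens=64,
)
print(out["choices"][0]["message"]["content"])
```

Quantized 3B-parameter models like this run entirely on CPU, which is why no GPU or cloud service is needed.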