Yup, this is a great example. LLMs are fine for non-opinion-based stuff, or for stuff that's not essential for life. It's great for finding a recipe, but if you're gonna rely on the internet or an LLM to help you form an opinion on something that requires objective thinking, then no. If I asked the internet/LLM "is humour good or bad," it would insert a swayed view.
It simply can't be trusted. I can't even trust it to return shopping links, so I have retreated back to real life. If it can't play fair, I no longer use it as a tool.
A_norny_mousse@feddit.org 3 days ago
Almost daily I'm seeing the fuck-ups resulting from somebody trying to fix something with ChatGPT, then coming to the forums because it didn't work.
NewNewAugustEast@lemmy.zip 3 days ago
I agree that happens, but it has nothing to do with what OP said. They didn't want a solution; they wanted a link to where the problem was being discussed so they could work out a solution themselves.
People seem to really confuse the difference between asking an LLM how to patch a boat vs. where people discussed ways to patch a boat.
Honytawk@feddit.nl 3 days ago
Most likely because if they came directly to whatever platform you are on with their problem, they would have been scolded for not trying hard enough to solve it on their own. Or the post would be closed because the question had already been asked.