Comment on "The people like AI because they treat it like a search engine"
chonglibloodsport@lemmy.world 3 days ago
They get an answer but unlike a search engine, the AI doesn’t show its work. I want a citation with the answer, I’m not taking your word for it!
boonhet@sopuli.xyz 3 days ago
Eh? You can ask it to provide sources and it will. Or Google AI does it by default.
There are lots of things wrong with AI, but much of the time that's actually not one of them.
Infrapink@thebrainbin.org 3 days ago
There is no guarantee those sources say what the answer says, or even that they exist. Generators can and do assemble words into phrases that look like citations for sources that don't exist. It's actually a problem for librarians, who keep getting accused of hiding nonexistent books "cited" by ChatGPT.
boonhet@sopuli.xyz 3 days ago
Oh you definitely have to double check. But what’s the point of sources if you don’t check them anyway?
And links in particular are super easy to check. Books and articles, obviously, less so.
chonglibloodsport@lemmy.world 3 days ago
Oh interesting. It should do this by default then.
Defaults matter. They normalize patterns of behaviour. People who become accustomed to not caring about citations are being trained to blindly accept whatever they're told. That's a recipe for an unthinking, obedient, submissive society.
4am@lemmy.zip 3 days ago
Congratulations, you're now caught up on the last two decades.
chonglibloodsport@lemmy.world 3 days ago
Oh this has been going on for centuries. Technology is always changing and so is culture! I think it’s usually the case that technology changes first and culture takes a while to catch up.
pinball_wizard@lemmy.zip 3 days ago
I find it interesting that every online AI I have encountered hides its work, while the open-source, locally hosted versions default to showing their work.
I’m not sure I have a grasp on the various motivations in play, but it doesn’t feel nice.
RampantParanoia2365@lemmy.world 3 days ago
Yes and no. It sometimes kind of tries to extrapolate from lots of sources and just gives you a few of them that don't really answer the question.