Comment on AI chatbots unable to accurately summarise news, BBC finds

Knock_Knock_Lemmy_In@lemmy.world 1 week ago
Treat LLMs like a super knowledgeable, enthusiastic, arrogant, unimaginative intern.

Phoenicianpirate@lemm.ee 1 week ago
I noticed that. When I ask it about things I am knowledgeable about, or simply wish to troubleshoot, I often find myself having to correct it. That makes me hesitant to follow its instructions on something I DON'T know much about.
Knock_Knock_Lemmy_In@lemmy.world 1 week ago
Oh yes. The LLM will lie to you, confidently.
Phoenicianpirate@lemm.ee 1 week ago
Exactly. I think this is a good barometer for gauging whether or not you can trust it. Ask it about things you know you're good at or knowledgeable about. If it gives good information, the kind you would give out yourself, then it is probably OK. If it is bullshitting you or making you go 'uhh, no, actually…' then you need to do more old-school research.
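A minimal sketch of that "barometer" in Python, assuming a hypothetical ask_llm helper standing in for whatever chatbot or API you actually use (the helper, probe questions, and matching logic here are illustrative, not from the thread): quiz the model on questions you can verify yourself, and treat the hit rate as a rough trust signal before relying on it for unfamiliar topics.

```python
def ask_llm(question: str) -> str:
    """Hypothetical placeholder: swap in a real chatbot or API call here.
    Returns a canned reply so the sketch runs end to end."""
    return "TCP stands for Transmission Control Protocol."


# Probes from a domain YOU know well, with answers you can verify yourself.
KNOWN_ANSWER_PROBES = [
    ("What does TCP stand for?", "transmission control protocol"),
    ("Which HTTP status code means 'Not Found'?", "404"),
]


def trust_barometer(probes) -> float:
    """Return the fraction of known-answer probes the model gets right."""
    hits = 0
    for question, expected in probes:
        reply = ask_llm(question).lower()
        if expected in reply:  # crude substring match; a human spot-check is better
            hits += 1
    return hits / len(probes)


if __name__ == "__main__":
    score = trust_barometer(KNOWN_ANSWER_PROBES)
    print(f"Model got {score:.0%} of known-answer probes right")
```

The substring check is deliberately crude; the point is the workflow, not the scoring. A low hit rate on topics you know is the 'uhh, no, actually…' signal that you should fall back to old-school research on topics you don't.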
milicent_bystandr@lemm.ee 1 week ago
Super knowledgeable but with patchy knowledge, so they’ll confidently say something that practically everyone else in the company knows is flat out wrong.