Comment on LLMs Will Always Hallucinate
AmbiguousProps@lemmy.today 1 week ago
I know about this. But what you're doing is different. It's too small, it's easily countered, and it won't change anything in a substantial way, because you're ultimately still providing it proper, easily processed content to digest.
msage@programming.dev 1 week ago
Also, they can just flag their input.