Comment on Librarians Are Tired of Being Accused of Hiding Secret Books That Were Made Up by AI

vivalapivo@lemmy.today 1 week ago

A hallucination is not just any mistake, if I understand it correctly. LLMs make mistakes, and that is the primary reason I don't use them for my coding job.

About a year ago, ChatGPT invented a Python library with a made-up API to solve the particular problem I asked about. The most recent hallucination I can recall was its claim that manual is a keyword in PostgreSQL, which it is not.
