Comment on We have to stop ignoring AI’s hallucination problem

Wirlocke@lemmy.blahaj.zone ⁨5⁩ ⁨months⁩ ago

I’m a bit annoyed at all the people being pedantic about the term “hallucinate.”

Programmers use preexisting concepts as allegory for computer concepts all the time.

Your file isn’t really a file, your desktop isn’t a desk, your recycling bin isn’t a recycling bin.

[Insert the entirety of Object Oriented Programming here]

Neural networks aren’t really neurons, genetic algorithms aren’t really genetics, and the LLM isn’t really hallucinating.

But it easily conveys what the bug is. It only personifies the LLM because the English language almost always personifies the subject. The moment you apply a verb to something, you imply it performed an action, unless you limit yourself to esoteric words/acronyms or overexplain with several words every time.
