Comment on Microsoft Study Finds Relying on AI Kills Your Critical Thinking Skills
Petter1@lemm.ee 1 week agoBecause the term fits way better…
pulsewidth@lemmy.world 1 week ago
A hallucination is a false perception of a sensory experience (sights, sounds, etc.).
LLMs don’t have any senses; they have input, algorithms, and output. They also have desired output and undesired output.
So, no, ‘hallucination’ fits far worse than failure, error, or bad output. However, assigning the term ‘hallucination’ does serve the billionaires in marketing their LLMs as actually sentient.
Tehdastehdas@lemmy.world 1 week ago
You might like confabulation better. Or bullshitting.