Comment on Microsoft Study Finds Relying on AI Kills Your Critical Thinking Skills
kratoz29@lemm.ee 1 week ago
Is that it?
One of the things I like most about AI is that it explains each command it outputs for you in detail. Granted, I am aware it can hallucinate, so if I have the slightest doubt about it I usually look it up on the web too (I use it a lot for basic Linux stuff and docker).
Would some people not give a fuck about what it says and just copy & paste it blindly? Sure, that happened in my teenage days too, when all the info was spread across many blogs and wikis…
As usual, it is not the AI tool that fucks up our critical thinking, it is ourselves.
I see it exactly the same way. I bet you can find similar articles about calculators, PCs, the internet, smartphones, smartwatches, etc.
Society will handle it sooner or later
LovableSidekick@lemmy.world 1 week ago
I love how they created the term “hallucinate” instead of saying it fails or screws up.
Petter1@lemm.ee 1 week ago
Because the term fits way better…
pulsewidth@lemmy.world 1 week ago
A hallucination is a false perception of sensory experiences (sights, sounds, etc).
LLMs don’t have any senses, they have input, algorithms and output. They also have desired output and undesired output.
So, no, ‘hallucination’ fits far worse than failure or error or bad output. However, assigning the term ‘hallucination’ does serve the billionaires in marketing their LLMs as actually sentient.
Tehdastehdas@lemmy.world 1 week ago
You might like confabulation better. Or bullshitting.