Comment on Get. Out

Brainsploosh@lemmy.world, 9 hours ago

It doesn’t reason, and it doesn’t actually know any information.

What it excels at is producing plausible-sounding averages of texts, and if you think about how little the average person knows, you should be appalled.

Also, where people can typically reason well enough to make an answer internally consistent, or at least relevant within a domain, LLMs offer a polished version of a disjointed amalgamation of the platitudes and other commonly repeated phrases in their training data.

Basically, you can’t trust the information to be right, insightful, or even unpoisoned, and relying on it sabotages the strategies and habits you use to sift information from noise.
