Comment on People Are Losing Loved Ones to AI-Fueled Spiritual Fantasies

dzso@lemmy.world 4 days ago

So you’re describing a reasoning model, which is 1) still based on statistical token sequences, and 2) trained on another tool (logic and discourse) that it uses to arrive at the truth. It’s a very fallible process. I can’t even begin to count the number of times a reasoning model has given me a completely false conclusion. Research shows that even the most advanced LLMs give incorrect answers as much as 40% of the time, IIRC. Which reminds me of a really common way that humans arrive at truth, one that LLMs aren’t capable of:

Fuck around and find out. Also known as the scientific method.
