Comment on On Tuesday afternoon, ChatGPT encouraged me to cut my wrists

Passerby6497@lemmy.world ⁨5⁩ ⁨days⁩ ago

when the reason for the offending output is that the user spent significant deliberate effort in coaxing the LLM to output what it did?

What about all the mentally unstable people who aren’t trying to get it to say crazy things, but end up getting it to say crazy things anyway, just by the very nature of the conversations they’re having with it? We’re talking about a stochastic yes man that can take any input and turn it into psychosis under the right circumstances, and we already have plenty of examples of it sending unstable people over the edge.

The only reason this is “click bait” is that someone chose to provoke it, rather than their own mental instability bringing it out organically. The fact that this can, and does, happen when someone is deliberately trying should make you seriously consider what it will tell someone who is in a state where they legitimately consider crazy shit to be good advice.
