Humans failed this guy.
I am not arguing this point, I agree.
A search engine presents the information that's available; it doesn't also talk you into doing it.
A stranger encouraging it in a chatroom should go to prison, as has happened in the past. Should this not also be illegal for LLMs?
LillyPip@lemmy.ca 5 weeks ago
You should read the filing.
Google might have clinically presented him the information, but it wouldn't have encouraged him: telling him to hide the marks on his neck from a previous failed attempt by wearing a black turtleneck, telling him how to tie the knot next time, and telling him to hide his feelings from his parents and others.
His parents had him in therapy. He also told the AI he wanted to leave a noose out where his parents would find it, and the AI told him not to. It actively encouraged him to hide all this from his parents. A Google search wouldn’t do that, and it sounds like his parents did care.