I guess my opinion will be hugely unpopular but it is what it is - I’d argue it’s natural selection and not an issue of LLMs in general.
Healthy and (emotionally) intelligent humans don’t get killed by LLMs. They know it’s a tool, they know it’s just software. It’s not a person and it does not guarantee correctness.
If someone gets killed because an LLM told them so, that person was already in mental distress and ready to harm themselves. The LLM is basically just the straw that broke the camel’s back. Same thing with physical danger. If you believe drinking bleach helps with back pain, there is nothing that can save you from your own stupidity.
LLMs are like a knife: it can be a tool to prepare food or it can be a weapon. It’s up to the one using it.
jaykrown@lemmy.world 4 months ago
en.wikipedia.org/…/Correlation_does_not_imply_cau…
REDACTED@infosec.pub 4 months ago
Seriously. There have always been people with mental problems or a tendency towards self-harm. You can easily find ways to off yourself on Google. You can get bullied on any platform. LLMs are just a tool. How detached from reality you get by reading religious texts or a ChatGPT convo depends heavily on your own brain.
atrielienz@lemmy.world 4 months ago
I like your username, and generally even agree with you up to a point.
But I think the problem is there are a lot of mentally unwell people out there who are isolated and using this tool (with no safeguards) to interact with socially as a sort of human stand in.
If a human actually agrees that you should kill yourself and talks you into doing it, they are complicit and can be held accountable.
Because chatbots are being billed as a product that passes the Turing test, I can understand why people would want the companies that own them to be held accountable.
These companies won’t let you look up how to make a bomb on their LLM, but they’ll let people confide suicidal ideation without putting in any safeguards for that, and because these models are designed to be agreeable, the LLM will agree with a person who tells it they think they should be dead.