You can make a car super safe, but if someone cuts the brake cable it will not slow down.
Comment on: On Tuesday afternoon, ChatGPT encouraged me to cut my wrists
unexposedhazard@discuss.tchncs.de 2 days ago
"purposely putting extreme effort into tricking the LLM into saying something harmful"
I don't think you understand how many mentally unstable people out there are using LLMs as therapists.
There are fuckloads of cases now of people who were genuinely misled by LLMs into doing horrible things.
I agree with your sentiment, but the thing is that the companies are selling their shit as gold and not as shit. If they were honest about their stuff being shit, then people wouldn't be able to capitalize off of malicious prompt engineering. If you claim "we made it super safe and stuff," then you are inviting people to test that.
h3rmit@lemmy.dbzer0.com 2 days ago
Spuddlesv2@lemmy.ca 2 days ago
More like SAYING you've made a car super safe when actually it's only safe if you never drive it above 5 km/h and never down hills, up hills, in the rain, etc.
Dojan@pawb.social 2 days ago
I do understand. I know there are people out there who think LLMs are literally Jesus returning. But in that case, write about that.
There are a lot more reasonable critiques of LLMs to be had.