It’s more an argument against using LLMs for things they’re not intended for. LLMs aren’t therapists, they’re text generators. If you ask one about suicide, it makes sense for it to generate text relevant to suicide, just as a search engine would.
The real issue here is that the parents either weren’t noticing or weren’t responding to the kid’s pain. They should be the first line of defense, and enlist professional help for things they can’t handle themselves.
yermaw@sh.itjust.works 3 weeks ago
You’re absolutely right, but the counterpoint that always wins is: “there’s money to be made, fuck you and fuck your humanity.”
ganryuu@lemmy.ca 3 weeks ago
Can’t argue there…