Comment on Teen killed himself after ‘months of encouragement from ChatGPT’, lawsuit claims
LillyPip@lemmy.ca 1 week ago
“but you can’t blame a machine for doing something that it doesn’t even understand.”
But you can blame the creators and operators of that machine for operating unethically.
If I build and sell a coffee maker that sometimes malfunctions and kills people, I’ll be sued into oblivion, and my coffee maker will be removed from the market. You don’t blame the coffee maker, but you absolutely hold the creator accountable.
Occhioverde@feddit.it 1 week ago
Yes and no. The example you gave describes a defective device, not an “unethical” one.
For LLMs we know damn well that they shouldn’t be used as a therapist or as a digital friend to ask for advice; they are no more than a powerful search engine.
An example more in line with the situation we’re analyzing is a kid who stabs himself with a knife after his parents left him playing with one; are you sure you want to sue the company that made the knife in that scenario?
LillyPip@lemmy.ca 1 week ago
Not really, though.
The parents know the knife can be used to stab people. It’s a dangerous implement, and people are killed with knives all the time.
LLMs aren’t sold as weapons, or even as tools that can be used as weapons. They’re sold as totally benign tools that can’t reasonably be considered dangerous.
That’s the difference. If you’re paying especially close attention, you may understand they can be dangerous, but most people are just buying a coffee maker.