Well, yeah. The people who host them for profit should be held liable.
Electricd@lemmybefree.net 8 months ago
LLMs can’t be fully controlled. They shouldn’t be held liable
JakenVeina@midwest.social 8 months ago
Epzillon@lemmy.world 8 months ago
“Ugh guys, we don’t know how this machine works, so we should definitely install it in every corporation, home, and device. If it kills someone we shouldn’t be held liable for our product.”
Not seeing the irony in this is beyond me. Is this a troll account?
If you can’t guarantee the safety of a product, limit or restrict its use cases, or provide safety guidelines or regulations, you should not sell the product. It is completely fair to blame the product and the ones who sell/manufacture it.
Electricd@lemmybefree.net 8 months ago
Safety guidelines are regularly given
If people purchase a knife and behave badly with it, it’s on them
Something that writes text isn’t comparable to a machine that could kill you. In the end, it’s always up to the person taking the action
I still wonder how ClosedOpenAI forcefully installed ChatGPT in this person’s home. Or how it is installed because they don’t have software… Quit your bullshit
Feathercrown@lemmy.world 8 months ago
This is more like selling someone a knife that can randomly decide of its own accord to stab them
Electricd@lemmybefree.net 8 months ago
That’s so blatantly false
Epzillon@lemmy.world 8 months ago
Except there are no guidelines or safety regulations in place for AI…
Electricd@lemmybefree.net 8 months ago
I mean safety guidelines written by ChatGPT and other service providers
scratchee@feddit.uk 8 months ago
Neither can humans, ergo nobody should ever be held liable for anything.
Civilisation is a sham, QED.
Electricd@lemmybefree.net 8 months ago
Glad to hear you are an LLM
scratchee@feddit.uk 8 months ago
“Safeguards and regulations make business less efficient” has always been true. We impose them anyway, because the harm of going without is worse.
In this case, if they can’t figure out how to control LLMs without crippling them, that’s pretty absolute proof that LLMs should not be used. What good is a tool you can’t control?
“I cannot regulate this nuclear plant without the power dropping, so I’ll just run it unregulated”.
Electricd@lemmybefree.net 8 months ago
Some food additives are responsible for cancer yet are still allowed, because their benefits generally outweigh their harms. Where you draw the line is up to you, but if you’re strict, you should still let people choose for themselves
LLMs are incredibly useful for a lot of things, and really bad at others. Why can’t people use the tool as intended, rather than stretching it to unapproved uses and putting themselves at risk?
surewhynotlem@lemmy.world 8 months ago
I made this car with a random number generator that occasionally blows it up. It’s cheap, so lots of people buy it. Totally not my fault that it blows up though. I mean yes, I designed it, and I know it occasionally explodes. But I can’t be sure when it will blow up, so it’s not my fault.
Electricd@lemmybefree.net 8 months ago
Comparing an automated system saying something bad with a car exploding is really fucking dumb
surewhynotlem@lemmy.world 8 months ago
Because you understood the point?
BananaIsABerry@lemmy.zip 8 months ago
Perhaps we should also hold the rope, knife, and various chemical manufacturers responsible.
The bridge architect? He designed a bridge that people jumped off of, so he’s at fault for sure.