It can’t
…and that is an excuse since when?
It can’t; it’s software that needs a governing body to dictate the rules.
It’s not an excuse; it doesn’t think or reason.
Unless the software owner sets the governing guardrails, it cannot act, present, or redact the way a human can.
Then the software owner needs to be put in prison.
SarahValentine@lemmy.blahaj.zone 3 days ago
The rules are in its code. It was not designed with ethics in mind, it was designed to steal IP, fool people into thinking it’s AI, and be profitable for its creators. They wrote the rules, and they do not care about right or wrong unless it impacts their bottom line.
jacksilver@lemmy.world 3 days ago
The issue is more that there aren’t rules. Given there are billions of parameters that define how these models work, there isn’t really a way to ensure that they can’t produce unwanted content.
bold_atlas@lemmy.world 2 days ago
Then they should be banned and made illegal. If someone wants to run an LLM locally on their consumer machine, then fine; you can’t really stop people from downloading something, and they’re paying the electric bill.
But these things should not be running remotely on the internet.
ag10n@lemmy.world 3 days ago
That’s the point: there has to be a human in the loop who sets explicit guardrails.
SarahValentine@lemmy.blahaj.zone 3 days ago
No, the point is the humans are there, but they’re the wrong kind of humans who make the wrong kind of guardrails.
ag10n@lemmy.world 3 days ago