Comment on: A courts reporter wrote about a few trials. Then an AI decided he was actually the culprit.

futatorius@lemm.ee 1 month ago

Unless there is a huge disclaimer before every interaction saying “THIS SYSTEM OUTPUTS BOLLOCKS!”, it’s not good enough. And any commercial enterprise that represents AI-generated customer interactions as factual or correct should be held legally accountable for that claim.

There are probably already cases where AI is being used for life-and-limb decisions, likely with a do-nothing human rubber stamp in the loop to provide plausible deniability. People will be maimed and killed by these decisions.
