Comment on An Algorithm Told Police She Was Safe. Then Her Husband Killed Her.

barsoap@lemm.ee 2 months ago

The way to use these kinds of systems is to have the judge come to an independent decision first. Only after that's keyed in does the AI reveal its own assessment, and whichever of the two predicts more danger is acted on; that way the score can never anchor the human downwards.
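A minimal sketch of that protocol in Python (the ordinal risk scale and function names are made up for illustration, not how any real system labels things):

```python
from enum import IntEnum

class Risk(IntEnum):
    # Hypothetical ordinal scale; a real system would define its own levels.
    LOW = 0
    MEDIUM = 1
    HIGH = 2
    EXTREME = 3

def measures_to_act_on(judge: Risk, model: Risk) -> Risk:
    """The judge's call is keyed in *before* the model's score is revealed,
    so the score can't anchor the human. Whichever predicts more danger wins."""
    return max(judge, model)

# Judge keyed in MEDIUM, the model independently says HIGH -> act on HIGH.
assert measures_to_act_on(Risk.MEDIUM, Risk.HIGH) is Risk.HIGH
```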

Relatedly, the way to have an AI select people and companies for tax investigators to spot-check is not to show the investigators the AI's scores, but to mix the AI's suspects into a stream of randomly selected cases.
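A rough sketch of that mixing, assuming a flat share of AI-flagged cases (every name and parameter here is invented):

```python
import random

def audit_queue(population, ai_flagged, size=100, ai_share=0.3, seed=None):
    """Mix the model's suspects into randomly drawn cases and shuffle,
    so investigators can't tell which cases the model flagged and have
    to judge each one on its own merits."""
    rng = random.Random(seed)
    n_flagged = min(int(size * ai_share), len(ai_flagged))
    flagged = rng.sample(ai_flagged, n_flagged)
    pool = [c for c in population if c not in flagged]
    queue = flagged + rng.sample(pool, size - n_flagged)
    rng.shuffle(queue)
    return queue
```

As a bonus, the random stream doubles as a control group: compare the hit rate on flagged cases against the purely random ones and you get an unbiased read on whether the model is actually finding anything.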

Relatedly, the way to have AI involved in medical diagnosis is not to tell the human doctor the model's results, but to have it suggest additional tests to run. The "have you ruled out lupus" approach.
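Something like this, where the doctor only ever sees the suggested tests, never the model's ranked diagnoses (the diagnosis-to-test mapping and the differential format are placeholders I made up):

```python
# Hypothetical mapping from candidate diagnoses to discriminating tests.
TESTS_FOR = {
    "lupus": {"ANA panel", "anti-dsDNA"},
    "lyme": {"ELISA", "western blot"},
}

def blinded_suggestions(differential, ordered_tests):
    """Surface only the follow-up tests implied by the model's differential,
    never the diagnoses themselves, so the doctor isn't primed toward
    the model's favourite answer."""
    suggested = set()
    for diagnosis, _score in differential:  # e.g. [("lupus", 0.7), ...]
        suggested |= TESTS_FOR.get(diagnosis, set())
    return sorted(suggested - set(ordered_tests))

# The model suspects lupus; the doctor only sees "order an ANA panel".
print(blinded_suggestions([("lupus", 0.7)], ordered_tests={"anti-dsDNA"}))
```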

And from what I've heard, the medical profession actually got this right from the very beginning; they know what priming and bias are. Law enforcement? I fear we'll have to ELI5 the basics to them for the next five hundred years.
