Comment on An Algorithm Told Police She Was Safe. Then Her Husband Killed Her.

andallthat@lemmy.world 3 months ago

About 20 new cases of gender violence arrive every day, each requiring investigation. Providing police protection for every victim would be impossible given staff sizes and budgets.

I think machine learning is not the key part; the quote above is. All 20 of these people a day come to the police for protection. A very small minority of them might just be paranoid, but I'm sure most of them have already had some bad shit done to them by their partner and (in an ideal world) would all deserve some protection. The algorithm's "success" is defined in the article as reducing the probability of repeat attacks, especially the ones that eventually lead to death.

The police are trying to focus on the ones deemed most at risk. If a well-trained algorithm can assess that risk better than a possibly overworked or inexperienced human handling the complaint, I'll take that. But people are going to die anyway. Just, hopefully, a bit fewer of them, and I don't think it's fair to say it's the machine's fault when they do.
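
To make that triage idea concrete, here is a minimal sketch of scoring-and-prioritising under limited protection capacity. Everything in it (the `Case` fields, the `risk_score` weights, the number of `protection_slots`) is invented purely for illustration; it is not the system from the article or any real police tool.

```python
# Toy illustration only: a made-up risk score and capacity-limited triage.
# The features, weights, and capacity below are hypothetical and exist only
# to show the shape of the problem: ~20 new cases a day, far fewer
# protection slots than cases.

from dataclasses import dataclass


@dataclass
class Case:
    case_id: str
    prior_assaults: int       # reported previous incidents by the same partner
    threats_to_kill: bool     # explicit threats were reported
    weapon_access: bool       # aggressor has access to a weapon
    victim_fears_death: bool  # victim states they fear for their life


def risk_score(case: Case) -> float:
    """Return a made-up risk score in [0, 1]; higher means higher assessed risk."""
    score = 0.15 * min(case.prior_assaults, 5)  # cap the contribution of history
    score += 0.25 if case.threats_to_kill else 0.0
    score += 0.20 if case.weapon_access else 0.0
    score += 0.15 if case.victim_fears_death else 0.0
    return min(score, 1.0)


def triage(cases: list[Case], protection_slots: int) -> tuple[list[Case], list[Case]]:
    """Rank today's cases by score and give the limited protection slots to the top of the list."""
    ranked = sorted(cases, key=risk_score, reverse=True)
    return ranked[:protection_slots], ranked[protection_slots:]


if __name__ == "__main__":
    today = [
        Case("A", prior_assaults=3, threats_to_kill=True, weapon_access=False, victim_fears_death=True),
        Case("B", prior_assaults=0, threats_to_kill=False, weapon_access=False, victim_fears_death=False),
        Case("C", prior_assaults=1, threats_to_kill=True, weapon_access=True, victim_fears_death=True),
    ]
    protected, unprotected = triage(today, protection_slots=1)
    for c in protected:
        print(f"protect {c.case_id} (score {risk_score(c):.2f})")
    for c in unprotected:
        print(f"no slot for {c.case_id} (score {risk_score(c):.2f})")
```

The point of the sketch is the last few lines: with one slot and three cases, somebody high on the list gets protection and somebody lower down does not, whether the ranking comes from a human or a model. The question is only which ranking makes fewer fatal mistakes.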
