False positives are fine because they only happen to poors.
Comment on AI-powered weapons scanners used in NYC subway found zero guns in one-month test
tal@lemmy.today 5 days ago
In total, there were 118 false positives — a rate of 4.29%.
Earlier this year, investors filed a class-action lawsuit, accusing company executives of overstating the devices’ capabilities and claiming that “Evolv does not reliably detect knives or guns.”
I mean, I’d be more concerned about the false positive rate than the false negative rate, given the context. Like, if you miss a gun, whatever. That’s just the status quo, which has been working; some money gets wasted on the machine. But if you’re incorrectly stopping more than 1 in 25 New Yorkers from getting on their train, and that rate holds across all subway riders, that sounds like a monumental clusterfuck.
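A quick back-of-the-envelope check of the numbers in the article (assuming the 4.29% figure means false positives per person scanned, which the article doesn't state explicitly):

```python
# Figures from the article: 118 false positives at a 4.29% rate.
false_positives = 118
rate = 4.29 / 100

# Implied number of people scanned during the one-month test,
# if the rate is false positives per scan.
total_scanned = false_positives / rate  # roughly 2,750 scans

# How often a scanned rider would be stopped incorrectly.
one_in = 1 / rate  # roughly 1 in 23, consistent with "more than 1 in 25"

print(f"Implied scans: {total_scanned:.0f}")
print(f"Incorrect stop: about 1 in {one_in:.0f} scanned riders")
```

Scaled to the millions of daily subway riders, even a 1-in-23 incorrect-stop rate would mean an enormous number of people flagged every day.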
barsquid@lemmy.world 4 days ago
dezmd@lemmy.world 4 days ago
“Not Hotdog”
jettrscga@lemmy.world 5 days ago
With how trigger happy police are, the false positives would lead to more deaths than they prevent. And police would claim it’s justified because the machine told them so.
Toribor@corndog.social 4 days ago
Facial recognition confirmed he was a criminal and the scanner confirmed he had a gun! Of course we opened fire instantly. How could we have known it was just some guy with a water bottle?
SomethingBurger@jlai.lu 4 days ago
We used advanced colorimetry to determine he was a criminal!