Yeah, there’s already at least one well-known case. This article mentions it: wp.api.aclu.org/press-releases/208236
The use of facial recognition technology by Project NOLA and New Orleans police raises serious concerns regarding misidentifications and the targeting of marginalized communities. Consider Randal Reid, for example. He was wrongfully arrested based on faulty Louisiana facial recognition technology, despite never having set foot in the state. The false match cost him his freedom, his dignity, and thousands of dollars in legal fees. That misidentification happened based on a still image run through a facial recognition search in an investigation; the Project NOLA real-time surveillance system supercharges the risks.
“We cannot ignore the real possibility of this tool being weaponized against marginalized communities, especially immigrants, activists, and others whose only crime is speaking out or challenging government policies. These individuals could be added to Project NOLA’s watchlist without the public’s knowledge, and with no accountability or transparency on the part of the police departments.”
Police use it to justify stops and arrests: alerts are sent directly to a phone app used by officers, enabling immediate stops and detentions based on unverified facial recognition matches.
CoolThingAboutMe@aussie.zone 10 months ago
In this video about Lavender AI, which is Israel’s Palantir, they talk about the confidence score the AI assigns targets for how sure it is that they are Hamas. That score is used to determine how expensive a weapon they will use to take that person out, and how many innocent bystanders they are willing to kill along with the target.
youtu.be/4RmNJH4UN3s
hendrik@palaver.p3x.de 10 months ago
Thanks, nice video, and it seems he has some numbers. Very inhuman that they worked out exact figures: there’s an allowance to take out 15 or 20 bystanders as well, or an entire elementary school if it's a high-ranking "target".