In this video about Lavender AI, which is Israel's Palantir, they talk about the accuracy score the AI gives targets for how sure it is that they are Hamas. That score is used to determine how expensive the weapons used to take that person out can be, and how many innocent bystanders they are willing to kill along with that target.
Comment on "Palantir Started By Spying on a City, Now Sells AI for War"
hendrik@palaver.p3x.de 3 weeks ago
Would be nice to get some numbers on the accuracy and performance of such a dystopian sci-fi technology (Minority Report?). There should be some, if it's been in operation for 13 years already...
CoolThingAboutMe@aussie.zone 3 weeks ago
hendrik@palaver.p3x.de 3 weeks ago
Thanks, nice video, and it seems he has some numbers. Very inhumane that they worked out exact figures: it's allowed to take out 15 to 20 bystanders as well, or an entire elementary school if it's a high-ranking "target".
AcidicBasicGlitch@lemm.ee 3 weeks ago
Yeah there’s already at least one well known case. This article mentions it wp.api.aclu.org/press-releases/208236
The use of facial recognition technology by Project NOLA and New Orleans police raises serious concerns regarding misidentifications and the targeting of marginalized communities. Consider Randal Reid, for example. He was wrongfully arrested based on faulty Louisiana facial recognition technology, despite never having set foot in the state. The false match cost him his freedom, his dignity, and thousands of dollars in legal fees. That misidentification happened based on a still image run through a facial recognition search in an investigation; the Project NOLA real-time surveillance system supercharges the risks.
“We cannot ignore the real possibility of this tool being weaponized against marginalized communities, especially immigrants, activists, and others whose only crime is speaking out or challenging government policies. These individuals could be added to Project NOLA’s watchlist without the public’s knowledge, and with no accountability or transparency on the part of the police departments
Police use to justify stops and arrests: Alerts are sent directly to a phone app used by officers, enabling immediate stops and detentions based on unverified purported facial recognition matches.
hendrik@palaver.p3x.de 3 weeks ago
I'm looking more for large-scale quantitative numbers. I mean, one destroyed life is really bad. But they could argue they'd have saved 30 lives in turn, and then we'd need to discuss how to do the maths on that...
AcidicBasicGlitch@lemm.ee 3 weeks ago
Yeah, I am not sure, but hopefully somebody has the numbers. That is usually the argument for trampling the constitution.
Hard to say how much has actually been documented, because they've been doing a lot of this stuff off the record.
They could claim they potentially saved 30,000 lives by locking up 100 people for crimes committed by somebody else, in states they've never even been to, and we might not even know about it.
hendrik@palaver.p3x.de 3 weeks ago
Oh well, some people in the USA have a really "interesting" relationship with their own constitution these days...
ScoffingLizard@lemmy.dbzer0.com 3 weeks ago
They accurately recognized my face from a picture that was over ten years old. I no longer look like I did in that picture. This is going to get bad now that our fascist state authoritarians are everywhere, jerking off with their riches and superiority. They have this tech and will do terrible things with it. Impending doom…
hendrik@palaver.p3x.de 3 weeks ago
I'm not very surprised. I think even old-school face recognition does things like measure the distances between your eyes, nose, etc., and that underlying structure (your skull) doesn't change a lot over 10 years of adulthood. The real danger is that they connect all of that information. And as you said, it's everywhere these days, they have a lot of sources and, of course, it's in the wrong hands.
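To illustrate why an old photo can still match: a minimal sketch of the classic geometry-based approach described above, using made-up landmark coordinates (the landmark names, points, and tolerance are all illustrative assumptions, not any real system's API). Distances between skull-fixed landmarks are normalized by the inter-eye distance, so the signature survives changes in image scale, and those ratios barely change over a decade of adulthood.

```python
# Sketch of geometric face matching: build a scale-invariant "signature"
# from distances between facial landmarks, then compare two signatures.
# All coordinates below are hypothetical example data.
from math import dist  # Euclidean distance between two points (Python 3.8+)

def face_signature(landmarks):
    """Return distance ratios from (x, y) landmark points.

    Each distance is divided by the inter-eye distance, so the signature
    does not depend on image resolution or camera distance.
    """
    eye_l = landmarks["eye_l"]
    eye_r = landmarks["eye_r"]
    nose = landmarks["nose"]
    chin = landmarks["chin"]
    inter_eye = dist(eye_l, eye_r)
    return (
        dist(eye_l, nose) / inter_eye,
        dist(eye_r, nose) / inter_eye,
        dist(nose, chin) / inter_eye,
    )

def same_person(sig_a, sig_b, tol=0.05):
    """Crude match: every normalized distance within `tol` of the other."""
    return all(abs(a - b) < tol for a, b in zip(sig_a, sig_b))

# Two captures of the same face: the second is the first scaled up 2x
# (e.g. a higher-resolution photo taken years later).
old_photo = {"eye_l": (30, 40), "eye_r": (70, 40),
             "nose": (50, 60), "chin": (50, 95)}
new_photo = {"eye_l": (60, 80), "eye_r": (140, 80),
             "nose": (100, 120), "chin": (100, 190)}

print(same_person(face_signature(old_photo), face_signature(new_photo)))  # True
```

Real systems use dozens of landmarks (or learned embeddings) rather than three ratios, but the core idea is the same: the comparison keys on proportions of the skull, not on hair, weight, or aging skin.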