Comment on The Tech Company Bringing Surveillance Dystopia to Your Town

AcidiclyBasicGlitch@sh.itjust.works 1 week ago

Beyond the controversy around the Texas self-managed abortion case, Flock has had to respond to evidence that local law enforcement agencies have used their data to assist Immigration and Customs Enforcement. It has now offered assurances that jurisdictions proactively banning data sharing related to immigration status or abortion seeking will be excluded from national searches, as long as the local yahoo with tactical undershorts is dumb enough to put “ICE” or “abortion” in the required reason field.

But it turns out that once you’ve built a massive distributed surveillance network, it’s hard to rein in its use. The state of Washington explicitly bans sharing data or equipment with federal officers for the purpose of immigration enforcement, yet the University of Washington found dozens of examples of exactly that. Some local departments explicitly opened up their Flock data to the feds despite the state law; others had their information siphoned off without their knowledge via an unspecified technological error.

The university study and an investigation by 404 Media found another category of information sharing that also subverted state attempts to fend off immigration overreach: federal officers just asking really nice if the local guy could run a search on their behalf and the local guy happened to use “ICE” or “ICE warrant” or “illegal immigration” in the local search (tactical undies recognizes tactical undies, you know?). Worth noting: A local officer well informed about jurisdictional data-sharing limitations would just not enter “ICE” as the reason for the search, and we have no idea how many of those cannier cops there are.

We have this built-in safety net that makes every user list the reason they used the database.

Reason for search: Not ICE

Checks out.

Already terrified? It gets worse: Flock is turning over more and more of its monitoring to AI, a feature that Flock (and the entire technology-media industrial complex) sells as a neutral efficiency. But the problem with AI is how deeply human it really is—trained on biased data, it can only replicate and amplify what it already knows. Misogyny and white supremacy are built into surveillance DNA, and using it to search for women seeking abortions or any other suspected “criminal” can only make the echo chamber more intense.

This month, an AI-powered security system (not Flock, surprisingly) tossed out an alarm to a school resource officer, and he called the police to the scene of a Black teenager eating chips. The teen described “eight cop cars that came pulling up to us [and] they started walking toward me with guns.” You can fault the resource officer for not clocking the chip bag; at least we know the point of failure.

source