Comment on It turns out that Juggalo makeup blocks facial recognition technology
favoredponcho@lemmy.zip 1 day ago
I think in the future everyone will wear masks like daft punk
gibson@sopuli.xyz 1 day ago
Which is already illegal in much of the US. Will it hold up in court? Maybe not, but that doesn’t stop cops from demanding you take it off.
oatscoop@midwest.social 1 day ago
Makeup isn’t, though. We all just need to get down with the clown.
Brb, buying FIZZ
some_kind_of_guy@lemmy.world 1 day ago
Hope so. Personally I’m pulling for a modular jumpsuit as well. Most of it will come as one piece, so no more worrying about pants and tops as separate items which have to be coordinated and can get lost. Socks/gloves/masks etc would be optional but will just snap onto the main bodysuit piece. Everything will be interchangeable, standardized and spares can be purchased anywhere. Different weights for different climates etc, but it’s all one platform, like the AR-15 of clothing.
Please vote for me in the upcoming primary. If I win I’m sending everyone a free suit. ~~(Which you’ll need, because they’ll be mandatory)~~ VOTE FOR ME
zod000@lemmy.dbzer0.com 1 day ago
That would at least be fun, if not a bit uncomfortable long term.
AHemlocksLie@lemmy.zip 1 day ago
It won’t work alone for long in the age of AI. You won’t be tracked and identified by face alone. It’ll be a complex array of data points. Your face, your hair, your eye color if the cameras have the resolution, your height, your gait, your posture, your scars and injuries, your visible birth defects, whether you use mobility aids, the wireless devices emitting signals in your pockets, the list goes on and on. They’ll assemble dozens of data points and make it extremely difficult to falsify enough to avoid detection instead of just getting flagged as suspicious.
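The fusion idea above can be sketched as a simple weighted match score: each signal contributes a little, so masking any one of them only removes part of the score. This is a toy illustration, not a real identification system; the feature names and weights are invented.

```python
# Toy sketch: fusing several weak identifiers into one match score.
# Feature names and weights are made up for illustration only.

WEIGHTS = {
    "face": 0.40,     # strongest single signal, but easily masked
    "gait": 0.20,
    "height": 0.10,
    "posture": 0.10,
    "devices": 0.20,  # wireless identifiers seen near the person
}

def match_score(observation: dict, profile: dict) -> float:
    """Weighted agreement between an observation and a stored profile.

    Each feature contributes its weight when it matches; a missing or
    spoofed feature contributes nothing, so hiding one signal (e.g. the
    face) only shaves off part of the score rather than zeroing it.
    """
    score = 0.0
    for feature, weight in WEIGHTS.items():
        if feature in observation and observation[feature] == profile.get(feature):
            score += weight
    return score

profile = {"face": "F123", "gait": "G7", "height": 175, "posture": "P2", "devices": "D9"}

full = match_score(profile, profile)  # all signals match
# Same person, face signal hidden by makeup: only "face" drops out.
masked = match_score({k: v for k, v in profile.items() if k != "face"}, profile)
```

With these (made-up) weights, a full match scores 1.0 while a masked face still scores 0.6, which is the thread's point: defeating one signal just degrades the match, it doesn't erase it.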
Landless2029@lemmy.world 23 hours ago
Even if you go fully privacy-focused with a locked-down phone and a dumb car, Flock cameras are tracking your plate, roadside scanners are tracking the TPMS sensors in your tires, and your phone still has to register its IMEI with cell towers, so all of that is logged.
AHemlocksLie@lemmy.zip 8 hours ago
God dammit, we can be tracked by the fucking tire sensors? Fucking hate this timeline…
Landless2029@lemmy.world 2 hours ago
Your car’s tire sensors could be used to track you
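The reason tire sensors are trackable is that each TPMS sensor broadcasts a fixed unique ID in the clear, so any receiver that logs (sensor ID, location, time) tuples can link sightings into a travel history without ever reading the plate. A toy sketch of that linking, with invented IDs and locations:

```python
# Toy sketch of TPMS-based tracking: log every broadcast keyed by the
# sensor's fixed ID, then read back one sensor's log as a travel history.
# Sensor IDs, intersections, and timestamps are all invented.
from collections import defaultdict

sightings = defaultdict(list)  # sensor_id -> [(location, timestamp), ...]

def log_broadcast(sensor_id: str, location: str, timestamp: int) -> None:
    """Record one overheard TPMS broadcast at a roadside receiver."""
    sightings[sensor_id].append((location, timestamp))

# The same four sensor IDs (one car) show up at two intersections:
for sid in ("0x1A2B", "0x1A2C", "0x1A2D", "0x1A2E"):
    log_broadcast(sid, "Main & 1st", 1000)
    log_broadcast(sid, "Oak & 5th", 1600)

# Any one sensor's log reconstructs the car's route.
route = sightings["0x1A2B"]
```

Because the four IDs always travel together, they also fingerprint the car as a set, which survives even swapping a single tire.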
Fedizen@lemmy.world 1 day ago
Once we have a case where AI-fabricated evidence is used to convict somebody, it will also get much easier to dismiss a lot of this data. Besides, I still get ads for dog food for a dog I don’t own.
AHemlocksLie@lemmy.zip 1 day ago
That won’t stop corporations and governments from surveilling. They’ll still collect highly accurate information about you. They may not trust public data, but they’ll still trust the systems they use to surveil, and they’ll still be right.
Chakravanti@monero.town 15 hours ago
You forgot your seven-sided dice.
GamingChairModel@lemmy.world 1 day ago
Will they actually devote the resources to try to pierce the anonymity of that handful of people? Everything we’ve seen about how tech companies operate is that they reach a threshold of “good enough for most cases” and don’t bother optimizing the edge cases. Collecting billions of data points, running dozens of analysis techniques on them, and then layering on some kind of meta-analysis to resolve disagreements between models would be resource-intensive beyond their own profit motives.
Someone who defeats gait analysis with a different pair of shoes (heel height, sole thickness, and back support all affect how people walk) and wears a mask might still lose the arms race if the tech companies choose to keep improving the tech even after it’s already good enough.
I think it’s possible but not inevitable. Especially if there’s a financial reckoning for AI companies soon.
AHemlocksLie@lemmy.zip 1 day ago
Even if businesses are willing to settle for good enough, governments most certainly will NOT. Those attempting to evade detection will be those they’re most interested in identifying, which is why I mentioned that failure to successfully falsify will get you flagged as having attempted it and probably how. From a government’s perspective, the ones attempting to evade detection are the ones most likely to be criminals or, even worse in their eyes, rebels. Governments, especially authoritarian ones, will make sure the tech constantly pushes the boundaries of what’s possible, or at the very least defeats the vast majority of known evasion techniques.
Then, if business really has left the evaders unidentified, they’ll start adopting the tech from government. Better data with no R&D? Why wouldn’t they at that point? Governments might even subsidize it because it helps them spread the greater surveillance network.
qaeta@lemmy.ca 23 hours ago
Given how rapidly they’ve been trying to tighten their grip lately? Yes. They want to know everything about everyone all the time. And they don’t give two shits if we consent to it.
Chakravanti@monero.town 15 hours ago
There’s always making them wrong by being different every time.
JoeMontayna@lemmy.ml 1 day ago
Well they are building data centers like they want to have enough resources to do just that.
ITGuyLevi@programming.dev 1 day ago
The clothing they wear solves most of those. The physical side (gait, height, etc.) is a little harder, but shoes have an effect on most of it (e.g. the round-bottomed shoes meant to help people work out just by walking wildly change a normal gait and posture).
For the devices though it could get fun. You could have a device mounted in the helmet that will pretend to be people you’ve passed, essentially just replaying the beacons (SSID broadcasts, etc) for the sake of a digital camouflage.
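The replay idea above can be sketched without any radio hardware by modeling probe requests as plain records: capture the beacons of devices you pass, then re-emit them later in shuffled order with fresh timestamps. This is a toy model only; the MAC addresses and SSIDs are invented, and real capture/replay would need monitor-mode Wi-Fi hardware.

```python
# Toy model of "digital camouflage": replay the probe requests of
# devices you previously passed (modeled here as dicts, not real
# 802.11 frames) so a sniffer attributes a crowd of other devices
# to your location. All MACs and SSIDs below are invented.
import random

captured = [
    {"mac": "aa:bb:cc:00:00:01", "ssid": "CoffeeShopWiFi"},
    {"mac": "aa:bb:cc:00:00:02", "ssid": "HomeNet-5G"},
    {"mac": "aa:bb:cc:00:00:03", "ssid": "xfinitywifi"},
]

def camouflage_burst(frames: list, now: float) -> list:
    """Replay captured probes in random order with jittered timestamps."""
    burst = []
    for frame in random.sample(frames, k=len(frames)):
        # Copy the frame and stamp it with a new, slightly jittered time
        # so the replay doesn't mirror the original capture timing.
        burst.append({**frame, "timestamp": now + random.uniform(0.0, 2.0)})
    return burst

burst = camouflage_burst(captured, now=0.0)
```

Every captured (MAC, SSID) pair reappears in the burst, just reshuffled in time, which is exactly what makes the sniffer's log ambiguous.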
AHemlocksLie@lemmy.zip 1 day ago
For now. This cat and mouse game will continue on and on. We’ll develop evasion techniques, they’ll learn to recognize and see through them. We’ll develop new ones, they’ll learn those too. What if you speak near a camera? It’ll learn to analyze voice and diction. Voice scrambler? AI is learning to descramble video and can probably learn it for voice, too. Your clothing style will become a data point, and an expensive one to consistently falsify. The locations you’re seen at are suggestive. If you walk a dog, good fucking luck convincing it to help you falsify data for the AI monitors.
Still, this is predicated on the assumption that you can recognize and falsify enough of the data points. My point is that they will collect however many data points it takes to make it nigh impossible to get a failure to identify you or a false positive. And if it’s a false positive, we have to question the ethics of pinning your trail on some other random dude.
ITGuyLevi@programming.dev 1 day ago
I agree for the most part, but if we are all walking around in jumpsuits and helmets (Daft Punk style) and repeating the digital beacons of everyone else, it seems like false positives are a skill issue for AI. Not too long ago I watched a video about a guy who, in the eyes of the AI, was a 100% match for someone who had been trespassed from a casino. When the cops showed up and he presented his documents, they took him to the station because they assumed he must have given a false ID when he was originally trespassed. He was eventually able to prove his innocence, but the fact that he was taken into custody because AI messed up means I have no issue with people intentionally poisoning the data.
None of this matters in the present context though because just by wearing that you would be easily identifiable.