It’s also potentially skipping some of the parts that should be looked at. It depends on the training set.
themurphy@lemmy.world 9 months ago
ITT: People who are scared of things they don’t understand, which in this case is AI.
In this case, the “AI” program is nothing more than pattern recognition software setting a timestamp where it believes there’s something to be looked at. Then an officer can take a look.
It saves so much time, and it filters out anything irrelevant. But be careful, because it’s labelled “AI”. Scary.
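Conceptually, that kind of flagging is just score-and-threshold. A minimal sketch (the scores and threshold here are hypothetical stand-ins; the real model is proprietary):

```python
# Sketch of "flag timestamps for human review".
# frame_scores: (timestamp_seconds, model_confidence) pairs.

def flag_for_review(frame_scores, threshold=0.8):
    """Return timestamps whose confidence meets or exceeds the threshold."""
    return [t for t, score in frame_scores if score >= threshold]

scores = [(0, 0.10), (30, 0.95), (60, 0.40), (90, 0.85)]
print(flag_for_review(scores))  # -> [30, 90]
```

An officer would then only review the flagged timestamps instead of the full recording.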
fluxion@lemmy.world 9 months ago
It’s not that AI is scary, it’s that AI is dumb as fuck.
Voroxpete@sh.itjust.works 9 months ago
This is an astonishingly bad take.
Almost every AI system is a black box. Even if you open source the code and the training data, it’s almost impossible to know anything about the current state of a machine learning model.
So the entire premise here is that a completely unaccountable system - whose decisions are basically impossible to understand or scrutinize - gets to decide what data is or isn’t relevant.
When an AI says “no crime spotted here”, who even gets to know that it made that call? If a human still reviews all of the footage, why have the AI at all? You’re doing the same amount of human work anyway.

So as soon as you introduce this system, you remove a huge amount of human oversight and replace it with decisions made by a completely unaccountable system - decisions that dramatically affect human lives, potentially life or death if it’s the difference between a bad cop being taken off the street or not.
Who’s to say whether the training data fed into this system results in it, say, becoming effectively blind to police violence against black people?
Misconduct@lemmy.world 9 months ago
It’s not impossible to understand or scrutinize. They give it specific things to look for, and it does what it’s told. You can make the argument that ANY tool used by the police will be misused; AI isn’t special in that regard. It’s not like we bother to hold anyone accountable for anything else anyway. Maybe the AI will be less biased.
It’s definitely not doing the same work as a human if humans are spared sifting through hours upon hours of less useful footage. I’m sure they’re testing it, etc. Nobody goes all in on this stuff. Really, you guys can be so very dramatic lol