I do not think you would find many people who actively support the use of artificial intelligence in the monitoring and moderating of hate speech or fascism. Those things must be moderated and resisted by people who can be held accountable for mistakes or oversteps, not machines that cannot be held accountable for anything.
Plebcouncilman@sh.itjust.works 1 day ago
I just want to point out a lot of you actively support this if it helps curb “hate speech” or fascists or nazis or whatever. But what can be used against the guilty may be used against the innocent so it is best that we do not allow it at all. Either all speech is free or none of it is, there’s no other way.
neoinvin@lemmy.zip 1 day ago
jjlinux@lemmy.zip 1 day ago
Or we could have legislation that punishes the companies that run these bullshit systems AND the authorities that allow and use them when they flop, like in this case.
Hey, dreaming is still free (don’t know how much longer though).
SugarCatDestroyer@lemmy.world 1 day ago
How can I put this: if you only dream while sitting on the couch, then alas, everything will end sadly.
a_wild_mimic_appears@lemmy.dbzer0.com 23 hours ago
The problem is a societal one. Let's see:
- The kids, for bullying her over her tan
- The school boards, for implementing the surveillance
- The parents, who allowed such surveillance in the first place
- The person screening what was flagged, for not sending the school counselor to talk with the kid
- The person who called the cops
- The cops, for arresting an 8th-grader and DOING A STRIP SEARCH AND KEEPING HER OVERNIGHT WTF instead of handing her over to her parents
This has gone through too many hands to even start blaming the companies. Anyone in this chain had the opportunity to do the right thing (ok, maybe the teens didn't, they don't know any better). No one did.
Surveillance shouldn't be so pervasive, but I have no issue with, e.g., surveillance in a prison; maybe in a hospital (not in the patient rooms, but to make sure no one steals the good stuff or that no patients get lost in a service tunnel); at a border; inside police stations, to make sure that prisoner rights are upheld; on military bases, for obvious reasons; and so on.
Your society is the issue, and therefore surveillance is everywhere except where it would be useful.
jjlinux@lemmy.zip 20 hours ago
You’re evidently an apologist for these crappy companies.
2 or 3 parents can’t do jack shit to avoid this, short of removing the kids from school and having them homeschooled.
There are 3 main factors that allow this shit to happen:
- Companies with absolutely no values, focused only on revenue, which ends up creating these bullshit AI "systems" that are broken as hell (ChatGPT 5, anyone?)
- Lack of legislation to serve the people who voted for these legislators hoping to live in a better society; instead, they choose to self-serve and allow the same shitty companies to do whatever they want AND sell their shit to institutions like schools, law enforcement and such, so that they can get money from them
- Authorities that use these same broken "systems", don't even test them properly, and take for granted that they will work because they are too fucking lazy to even care, such as schools (which have authority over our kids because we have allowed it) and law enforcement (if they can even still be called that)
There is no way to justify any of the 3 factors.
Are there parents who could be doing more to avoid this kind of shit? Absolutely. Will parents pushing for change do anything toward fixing it? Not if the other 3 factors don't play their part as well.
Get fucking real.
frongt@lemmy.zip 1 day ago
Nope. It should be reviewed by a human, and the response should be proportionate.
EtherWhack@lemmy.world 1 day ago
Not just any human. It should be a board-certified child psychologist. They would be one of the few who could distinguish a legitimate threat, concern, or bullying from a poor joke or a stressed-out kid just venting with an empty threat. And on a positive 'hit', it should just be a visit from the counselor to see what's going on. IMO, psychologists should also be the only ones allowed to look at any of the info, as they should know how to keep private conversations private if intervention is unnecessary.
The idea behind the software does show some merit, but it is way, way too underdeveloped and grossly misused to be of any use.