Spain sentences 15 schoolchildren over AI-generated naked images
Submitted 4 months ago by QuantumSpecter@lemmy.world to technology@lemmy.world
Comments
SnokenKeekaGuard@lemmy.dbzer0.com 4 months ago
I read the headline and said "oh, come on." One paragraph in, that turned to "what in the absolute fuck."
Zak@lemmy.world 4 months ago
Are you surprised by teenage boys making fake nudes of girls in their school? I’m surprised by how few of these cases have made the news.
I don’t think there’s any way to put this cat back in the bag. We should probably work on teaching boys not to be horrible.
cyberpunk007@lemmy.ca 4 months ago
I’m not sure you can teach boys not to be horny teenagers 😜
pennomi@lemmy.world 4 months ago
There are always two paths to take: take away all of humanity's tools, or aggressively police the people who abuse them. No matter the tool (AI, computers, guns, cars, hydraulic presses), somebody will abuse it, and for society to function properly we have to do something about that delinquent minority.
Evotech@lemmy.world 4 months ago
Teenagers are literally retarded. Like their reasoning centers are not developed and they physically cannot think. There’s no way to teach that
BruceTwarzen@lemm.ee 4 months ago
It’s like those x-ray apps that obviously didn’t work but were promoted as letting you see all the women naked. Somehow that was very cool and no one cared. Suddenly there’s something that kind of works, and everyone is shocked.
IllNess@infosec.pub 4 months ago
They are releasing stories like this to promote the new law that requires adults to log in to porn sites and limits their use of them.
kibiz0r@midwest.social 4 months ago
“Absolutely no way to prevent this”, says internet full of banners offering to “Undress your classmates now!”
“Tools are just tools, and there’s no sense in restricting access to
undress_that_ap_chemistry_hottie.exe
because it wouldn’t prevent even a single case of abuse and would also destroy every legitimate use of any computer anywhere”, said user deepfake-yiff69@lemmy.dbzer0.com
spamfajitas@lemmy.world 4 months ago
It’s possible I just haven’t come across those types of comments you’re making fun of, but I usually just see people making the case that we don’t need new, possibly overreaching, legislation to handle these situations. They want to avoid a disingenuous “think of the children” kind of situation.
a youth court in the city of Badajoz said it had convicted the minors of 20 counts of creating child abuse images and 20 counts of offences against their victims’ moral integrity
I’m not familiar with their legal system but I would be willing to bet the crimes they’ve committed were already illegal under existing laws.
afraid_of_zombies@lemmy.world 4 months ago
Well you certainly managed to get attention to your comment
autotldr@lemmings.world [bot] 4 months ago
This is the best summary I could come up with:
A court in south-west Spain has sentenced 15 schoolchildren to a year’s probation for creating and spreading AI-generated images of their female peers in a case that prompted a debate on the harmful and abusive uses of deepfake technology.
Police began investigating the matter last year after parents in the Extremaduran town of Almendralejo reported that faked naked pictures of their daughters were being circulated on WhatsApp groups.
Each of the defendants was handed a year’s probation and ordered to attend classes on gender and equality awareness, and on the “responsible use of technology”.
Under Spanish law minors under 14 cannot be charged but their cases are sent to child protection services, which can force them to take part in rehabilitation courses.
In an interview with the Guardian five months ago, the mother of one of the victims recalled her shock and disbelief when her daughter showed her one of the images.
“Beyond this particular trial, these facts should make us reflect on the need to educate people about equality between men and women,” the association told the online newspaper ElDiario.es.
The original article contains 431 words, the summary contains 181 words. Saved 58%. I’m a bot and I’m open source!
Zeratul@lemmus.org 4 months ago
What does this have to do with equality between men and women? That girls are more at risk of this kind of abuse? That’s a fair point, but it isn’t made here. The parent comment is trying to make political something that simply isn’t. Not that gender equality should be political in the first place.
0laura@lemmy.world 4 months ago
There’s a good chance that these behaviors originated from misogyny/objectifying women.
Sensitivezombie@lemmy.zip 4 months ago
Why not also go after the software companies for allowing such images to be generated? i.e. allowing AI-generated nudes to be produced from uploaded images of real people.
Duamerthrax@lemmy.world 4 months ago
How? How could you make an algorithm that correctly identified what nude bodies look like? Tumblr couldn’t differentiate between nudes and sand dunes back when they enforced their new policies.
KairuByte@lemmy.dbzer0.com 4 months ago
This sounds great, but it’s one of those things that is infinitely easier to say than do. You’re essentially asking for one of two things: Manual human intervention for every single image uploaded, or “the perfect image recognition system.” And honestly, the first is fraught with its own issues, and the second does not exist.
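The Tumblr dunes-vs-nudes anecdote above has a simple structural cause. Here is a toy sketch (not a real NSFW model; all scores are made-up) of why any automated filter reduces to a score plus a threshold, and why moving that threshold only trades one kind of error for the other:

```python
# Toy illustration: a classifier assigns each image a "nudity" score,
# and moderation picks a threshold. When the score distributions of
# harmless and violating images overlap, no threshold gets both right.

def classify(score, threshold):
    """Flag an image as a violation if its score meets the threshold."""
    return score >= threshold

# Hypothetical scores: sand dunes (harmless) overlap with actual nudes.
dunes = [0.55, 0.60, 0.62]   # harmless images the model finds "skin-like"
nudes = [0.58, 0.70, 0.90]   # actual policy violations

def error_counts(threshold):
    false_positives = sum(classify(s, threshold) for s in dunes)
    false_negatives = sum(not classify(s, threshold) for s in nudes)
    return false_positives, false_negatives

# A strict threshold flags every dune; a lenient one lets a nude through.
print(error_counts(0.50))  # -> (3, 0): all dunes wrongly blocked
print(error_counts(0.65))  # -> (0, 1): dunes pass, one nude slips by
```

With overlapping distributions like these, the only remaining levers are a better model (less overlap) or human review of the borderline band, which is exactly the "manual intervention vs. perfect recognition" dilemma described above.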
autonomoususer@lemmy.world 4 months ago
Next they’ll ban E2EE and libre software.
0x0@programming.dev 4 months ago
Under Spanish law minors under 14 cannot be charged
What about the parents?
vane@lemmy.world 4 months ago
I think parents under 14 also cannot be charged.
FoolHen@lemmy.world 4 months ago
Ahh the old ~~reddit~~ Lemmy switcharoo
rustyfish@lemmy.world 4 months ago
WTF?!
sigmaklimgrindset@sopuli.xyz 4 months ago
Spain is a pretty Catholic country, and even if religious attendance is dropping off, the ingrained beliefs can still remain. Madonna/Whore dichotomy still is very prevalent in certain parts of society there.
leftzero@lemmynsfw.com 4 months ago
Hasn’t been in decades. No one in Spain gives a flying fuck about religion. Even the ones that go on processions and whatnot only do it because it’s traditional, and because they like dressing up and wearing silly hats.
The kids didn’t tell because they’re kids. They thought they’d be blamed for it because kids get blamed for everything, and because they know their parents don’t understand how generative “AI” works and would believe they were actually taking naked pictures and circulating them themselves.
cyberpunk007@lemmy.ca 4 months ago
Psychology 101.
todd_bonzalez@lemm.ee 4 months ago
Welcome to Christianity.
If a man sexually exploits a woman, it’s the woman’s fault for leading him astray.
This is how women are treated in deeply Christian communities.
Those women fear stepping forward to report assault or abuse because there are many in their community that will condemn them for it.
CybranM@feddit.nu 4 months ago
Welcome to religion at large