This is sort of a problem in regular porn too. I'm not sure if the acting has improved, but sometimes I'm turned off because I'm not sure whether the acts are in some way coerced. Especially given some of the recent stories about modeling operations where they take performers' passports and such.
Comment on: "AI-generated child sexual abuse images could flood the internet. A watchdog is calling for action"
BombOmOm@lemmy.world 1 year ago
Art depicting minors in sexual situations has been deemed part of free speech. It’s why, at least in the US, you don’t have to worry about the anime girl that’s 17 landing you in prison on child porn charges. The reasoning was there was no victim, the anime girl is not sentient, therefore this falls under free speech.
I suspect a similar thing will happen with this. As long as it is not depicting a real person, the completely invented person is not sentient, there is no victim, this will fall under free speech. At least in the US.
However, it is likely a very, very bad idea to have any photo-realistic art of this manner, as it may not be clear to authorities if it is from AI or if there is in fact a person you are victimizing.
HubertManne@kbin.social 1 year ago
gregorum@lemm.ee 1 year ago
Yeah, I get turned off by porn if the actors don't seem all that into it. "Possibly coerced" sets off alarms, although I hardly ever run across that.
CptBread@lemmy.world 1 year ago
Anything that looks realistic should be illegal if you ask me, as otherwise it would become harder to prosecute real child porn. "Oh, that picture is just AI-modified" could be hard to disprove…
Supermariofan67@programming.dev 1 year ago
Exactly that has already been tried, and it was struck down by the Supreme Court in Ashcroft v. Free Speech Coalition. It turns out porn of people over 18 very often looks the same as porn of people under 18, so such a law would ban a considerable amount of legal adult content.
BrianTheeBiscuiteer@lemmy.world 1 year ago
Possession of CSAM that's 20 years old (i.e. the subject is now an adult) or even 100 years old (i.e. the subject is likely deceased) is still illegal. You don't have to pay for it or create it, just possess it. Yeah, they'll find a way to prosecute for images of non-existent children.
fubo@lemmy.world 1 year ago
Deepfakes of an actual child should be considered defamatory use of a person’s image; but they aren’t evidence of actual abuse the way real CSAM is.
Remember, the original point of the term "child sexual abuse material" was to distinguish images and video made through the actual abuse of a child from material not involving any actual abuse, such as erotic Harry Potter fanfiction, drawings from imagination, and the like.
Purely fictional depictions, not involving any actual child being abused, are not evidence of a crime. Even deepfake images depicting a real person, but without their actual involvement, are a different sort of problem from actual child abuse. (And should be considered defamatory, same as deepfakes of an adult.)
But if a picture does not depict a crime of abuse and does not depict a real person, it is basically an illustration, the same as if it were drawn with a pencil.
Uranium3006@kbin.social 1 year ago
Although that distinction lasted about a week before the same bad actors who cancel people over incest fanfic started calling all of the latter CSEM too.
fubo@lemmy.world 1 year ago
As a sometimes fanfic writer, I do notice when fascists attack sites like AO3 under a pretense of “protecting children”, yes.
Uranium3006@kbin.social 1 year ago
And it's usually fascists, or at least people who may not consider themselves as such but think and act like fascists anyway.
Pyro@pawb.social 1 year ago
Here's an extra twist: hopefully, if the sickos are at least satisfied with AI stuff, they won't need the "real" thing.
Sadly, a lot of it does evolve from wanting to "watch" to wanting to do.
hoshikarakitaridia@sh.itjust.works 1 year ago
This is the part where I disagree, and I would love for people to prove me wrong, because whether this is true or false will probably be the deciding factor in allowing or restricting "artificial CSAM".
topinambour_rex@lemmy.world 1 year ago
Have you got a source for this?
fubo@lemmy.world 1 year ago
Some actually fetishize causing suffering.
JohnEdwa@sopuli.xyz 1 year ago
Some people are sadists and rapists, yes, regardless of the age group they'd target. The amount of rape-fantasy porn on the internet is rather staggering.