jaschen@lemm.ee 4 days ago
I’m no pedo, but what you do in your own home that hurts nobody is your own business.
reseller_pledge609@lemmy.dbzer0.com 4 days ago
Yes, but how is the AI making the images or videos? It has to be trained on SOMETHING.
So, regardless of direct harm or not, harm is done at some point in the process, and it needs to be stopped before it slips and gets worse because people “get used to” it.
Mnemnosyne@sh.itjust.works 4 days ago
AI can combine two things. It can train on completely normal pictures of children, and it can train on completely normal porn, and then it can put those together.
This is the same reason it can do something like Godzilla with Sailor Moon’s hair, not because it trained on images of Godzilla with Sailor Moon’s hair, but because it can combine those two separate things.
reseller_pledge609@lemmy.dbzer0.com 4 days ago
Fair enough. I still think it shouldn’t be allowed though.
RightEdofer@lemmy.ca 4 days ago
Except the real things are actual humans, who have likely not consented to being in this database at all, let alone having parts of their likeness used for this horrific shit. There is no moral argument for this garbage.
Kusimulkku@lemm.ee 4 days ago
I wouldn’t think it needs to have child porn in the training data to be able to generate it. It has porn as data, it knows what kids look like, and it can merge the two. I think that works for anything AI knows about: make this resemble that.
reseller_pledge609@lemmy.dbzer0.com 4 days ago
That’s fair, but I still think it shouldn’t be accepted or allowed.
MonkderVierte@lemmy.ml 4 days ago
Old cp?

Ah, right, almost forgot the killer games rhetoric.
reseller_pledge609@lemmy.dbzer0.com 4 days ago
I also don’t agree with the killer games thing, but humans are very adaptable as a species.
Normally that’s a good thing, but in a case like this, exposure to something shocking or upsetting can make it less shocking or upsetting over time (obviously not in every case). So, if AI is being used for something like this and it’s being reported on, isn’t it possible that people might slowly get desensitized to it over time?