I can’t say I entirely agree. I do think they should be helped, but in a measured and rigorous way, none of this “let them find shit online that quells their needs”. Pedophilia, in the psychological profession, is viewed in a similar light to sexual orientations; on that point, the person I’m responding to is correct. It’s simply that they seem blind to any nuance beyond the stance they’re stuck on.
Simply accepting AI pedophilia is certainly a very risky move when we don’t even have any data on how consumption of real or virtual CSAM affects those who indulge in it, and getting that data would, as far as I can tell, require very unethical and likely illegal research. The approach Kerfuffle@shi.tjust.works is suggesting is naive and myopic in the most generous light, which is how I’m choosing to take it so as not to accuse them of something they may not be guilty of.
I’m also someone who’s extremely progressive, and while I can sympathize with people who have these urges and no true wish to act on them, I think it’s outright malicious to say the solution is simply to let them get by on informal self-treatment based on online “common sense” idealism. Mental health support should absolutely be available and encouraged; part of that is making sure people feel safe disclosing this to medical professionals, but no part of that is having this shit freely spread online.
I appreciate your measured and metered response. I think these are extremely tricky conversations to have, but important, especially with how technology is progressing.
Falmarri@lemmy.world 1 year ago
Tibert@compuverse.uk 1 year ago
Half the answers in this cursed thread. Like wtf is this thread.
Pedophilia IS AN ADDICTION!!! Fueling it with anything, even AI, will worsen the ADDICTION!
Falmarri@lemmy.world 1 year ago
Tibert@compuverse.uk 1 year ago
What, you don’t have any more arguments, so you resort to calling it stupid and throwing insults?
Yours seems to be the stupid one.