Comment on X’s deepfake porn feature clearly violates app store guidelines. Why won’t Apple and Google pull it?
ILikeTraaaains@lemmy.world 1 week ago
In some places it is CSAM, and in others legislation to classify it that way is in the works.
I think the issues are:
- It can pass as real
- Unlike run-of-the-mill cartoon porn, real photos of children (even non-CSAM ones) are involved, either in the training data or as input to the generation