Comment on I just developed and deployed the first real-time protection for lemmy against CSAM!
joshuaacasey@lemmy.world 1 year ago
False positives not only lead to unnecessary censorship, but also waste resources that would be better used to protect *actual* victims and children (although the optimal solution is protecting people before any harm is done, so that we don’t even need these “band-aid” solutions for reacting afterward).
xeddyx@lemmy.nz 1 year ago
Again, what you’re saying isn’t relevant to Lemmy at all. Please elaborate: how would a graphics card on some random server help protect actual victims?
mojo@lemm.ee 1 year ago
Unnecessary censorship is fine when it’s clearly an underage person. You don’t need to check their ID to tell if it’s CSAM, nor do you need to with AI-generated child content. If you want to debate its legality, that’s a different conversation, but even an AI-generated version is enough to mentally scar the viewer, so there is still harm being done.
joshuaacasey@lemmy.world 1 year ago
An imaginary person has no ID because an imaginary person doesn’t exist. Why do you care about protecting imaginary characters instead of protecting actual, real, existing human beings?