Comment on lemm.ee plans for mitigating image upload abuse

rarely@sh.itjust.works 1 year ago

Lemmy admins need to do whatever they can to handle CSAM if and when it arises. Users need to be understanding about this because, as I’ve argued in other threads, CSAM poses a threat to the instance itself, and to the admins personally, if they cannot remove the material in a timely manner.
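To be concrete, “removing” here usually means a full purge rather than a soft delete, so the media is actually gone from storage. A minimal sketch of what that looks like against Lemmy’s HTTP API, assuming the v3 endpoints as of roughly Lemmy 0.19 (older releases expected the auth token in the request body); the instance URL and credentials are placeholders:

```python
# Hedged sketch: hard-deleting (purging) an abusive post via Lemmy's HTTP API.
# Endpoint paths and fields are my best understanding of the v3 API; verify
# against your instance's version before relying on this.
import requests

INSTANCE = "https://lemmy.example"  # hypothetical instance URL


def login(username: str, password: str) -> str:
    """Log in as an admin and return the JWT used for authenticated calls."""
    resp = requests.post(
        f"{INSTANCE}/api/v3/user/login",
        json={"username_or_email": username, "password": password},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["jwt"]


def purge_post(jwt: str, post_id: int, reason: str) -> None:
    """Permanently remove a post (and its attached media) from the instance."""
    resp = requests.post(
        f"{INSTANCE}/api/v3/admin/purge/post",
        headers={"Authorization": f"Bearer {jwt}"},
        json={"post_id": post_id, "reason": reason},
        timeout=30,
    )
    resp.raise_for_status()


# Example: purge_post(login("admin", "password"), 12345, "CSAM takedown")
```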

This is likely going to get weird for a bit, including but not limited to:
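As one hedged illustration of the kind of stopgap involved: an admin might sweep new uploads against a list of known-bad hashes and quarantine anything that matches. The storage paths below are assumptions about a typical pict-rs deployment, and real setups would lean on a vetted matching service, since plain file hashes miss re-encoded copies:

```python
# Hedged sketch: quarantine newly uploaded images whose SHA-256 appears in a
# known-bad hash list. Paths and the hash-list source are hypothetical.
import hashlib
import shutil
from pathlib import Path

UPLOAD_DIR = Path("/srv/pictrs/files")       # hypothetical pict-rs storage dir
QUARANTINE = Path("/srv/pictrs/quarantine")  # hypothetical quarantine dir
KNOWN_BAD: set[str] = set()                  # populate from a trusted hash list


def sweep() -> None:
    """Move any upload whose SHA-256 matches the known-bad list out of serving."""
    QUARANTINE.mkdir(parents=True, exist_ok=True)
    for path in UPLOAD_DIR.iterdir():
        if not path.is_file():
            continue
        digest = hashlib.sha256(path.read_bytes()).hexdigest()
        if digest in KNOWN_BAD:
            shutil.move(str(path), QUARANTINE / path.name)
```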

I just want folks to know that major sites like Reddit and Facebook usually have (not very well) paid teams of people whose sole job is to remove this material. Lemmy has overworked volunteers. Please have patience, and if you feel like arguing about why any of the methods I mentioned above are BS, or if you have any questions, reply to this message.

I’m not an admin, but I’m planning to become one, and I’m getting a feel for how the community responds to this sort of action. We don’t get to see it much on major social media sites because they aren’t as transparent (or as understaffed) as Lemmy instances are.

source