Admins are actively looking into solutions: nobody wants that material stored on their servers, and there are legal obligations you have to follow when it happens.
One of the problems is the compute cost of running CSAM detection on images before they're accepted, which makes it unviable for many instances. Lemmy.world is moving towards only allowing images hosted on whitelisted external sites, I think (a rough sketch of that kind of allowlist check is below).
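For illustration only, a minimal instance-side allowlist check might look something like this; the hostnames are placeholders, not Lemmy.world's actual list:

```python
from urllib.parse import urlparse

# Hypothetical allowlist of external image hosts; each instance would pick its own.
ALLOWED_IMAGE_HOSTS = {"i.imgur.com", "files.catbox.moe"}

def is_allowed_image_url(url: str) -> bool:
    """Return True only if the image URL points at an allowlisted host."""
    host = urlparse(url).hostname or ""
    return host.lower() in ALLOWED_IMAGE_HOSTS

# Unknown hosts (and direct uploads rewritten to local URLs) get rejected.
print(is_allowed_image_url("https://i.imgur.com/abc123.png"))    # True
print(is_allowed_image_url("https://random-host.example/x.png"))  # False
```

The idea is simply to push image hosting (and the associated liability and scanning cost) off the instance's own storage.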
misk@sopuli.xyz 9 months ago
Whatever is done to fight spam should be useful in fighting CSAM too. The latest “AI” boom could prove lucky for non-commercial social networks, since content recognition is something that can leverage machine learning. Obviously it’s a significant cost, so pitching in to cover running costs will have to become more common.
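For what it's worth, most platforms today match uploads against hash lists of known material (PhotoDNA/PDQ-style) rather than running a classifier on every image. A rough sketch of that idea using the open `imagehash` library, with a made-up hash list and threshold, could look like:

```python
from PIL import Image
import imagehash  # pip install ImageHash

# Placeholder perceptual hashes of known-bad images; real lists come from
# organisations like NCMEC, they are never assembled locally.
KNOWN_BAD_HASHES = [imagehash.hex_to_hash("8f373714acfcf4d0")]
MAX_DISTANCE = 5  # Hamming-distance threshold; lower = fewer false positives

def matches_known_material(path: str) -> bool:
    """Perceptually hash an image and compare it against the known-hash list."""
    h = imagehash.phash(Image.open(path))
    return any(h - bad <= MAX_DISTANCE for bad in KNOWN_BAD_HASHES)

if matches_known_material("upload.jpg"):
    print("Reject the upload and follow the instance's reporting process.")
```

Hash matching is cheap compared to running an ML classifier per image, which is why shared hash lists plus pooled funding for the heavier ML screening seems like the realistic path for small instances.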