Comment on: I just developed and deployed the first real-time protection for lemmy against CSAM!
ABluManOnLemmy@feddit.nl 1 year ago
Be careful with this, though. I think I remember that some jurisdictions require server owners to report CSAM rather than delete it. Verify that you aren’t obligated to keep it before deleting it.
explodicle@local106.com 1 year ago
Does that leave open a possible attack, in which the attacker can just fill up the server’s hard drive with AI-generated CSAM?
ABluManOnLemmy@feddit.nl 1 year ago
I think that if, in good faith, the server is unable to accept more CSAM because its hard drive is full, there isn’t an issue. The intent of the law is that if someone knows something is CSAM, they need to report it. I don’t think the government is going to come down hard on Lemmy server owners unwittingly receiving CSAM through federation (though it would certainly want them to report and take down the CSAM on their servers).
deFrisselle@lemmy.sdf.org 1 year ago
It’s not getting uploaded, so there’s nothing to keep. That’s the point: the kiddy porn never hits the server. There might be an argument for saving the scanner cache for later reporting to authorities. That is assuming the scanner also logs the account, IP, time, etc. of the upload.
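Something like this sketch would cover it (scan_image here is just a stand-in for whatever classifier the instance actually runs, and the log path is made up; none of this is the real tool’s API). Only metadata is ever persisted, never the image itself:

```python
import hashlib
import json
import time

FLAGGED_LOG = "/var/log/lemmy/flagged_uploads.jsonl"  # assumed path

def scan_image(image_bytes: bytes) -> bool:
    """Placeholder for the instance's real classifier (an assumption)."""
    return False  # the real scanner would return True for flagged content

def handle_upload(image_bytes: bytes, account: str, ip: str) -> bool:
    """Return True to accept the upload, False to block it."""
    if scan_image(image_bytes):
        record = {
            "time": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
            "account": account,
            "ip": ip,
            # A hash lets investigators identify the image without the
            # server ever storing the image itself.
            "sha256": hashlib.sha256(image_bytes).hexdigest(),
        }
        with open(FLAGGED_LOG, "a") as log:
            log.write(json.dumps(record) + "\n")
        return False  # blocked: the image bytes are never written to storage
    return True
```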
xtremeownage@lemmyonline.com 1 year ago
There wouldn’t be anything to delete, as it would have never been saved with this.
shagie@programming.dev 1 year ago
www.law.cornell.edu/uscode/text/18/2258A
Check with a lawyer whether blocking an upload that your server has access to because of suspected CSAM constitutes “actual knowledge of any facts or circumstances” under that statute.
xtremeownage@lemmyonline.com 1 year ago
HEY LOCAL PD OFFICE,
SOMEONE UPLOADED SOME POTENTIAL CHILD PORN TO MY LEMMY INSTANCE.
No… I don’t have an IP for who uploaded it.
Sorry, I don’t know where it came from. It just got federated across the fediverse to me.
No… I don’t have the content either, it doesn’t get saved.
Sorry… I guess I really don’t have any details at all for you.
zoe@infosec.pub 1 year ago
Hosting a Lemmy instance in the US is a headache.
shagie@programming.dev 1 year ago
Reporting to NCMEC is very different from reporting to your local PD office.
obinice@lemmy.world 1 year ago
So the image never touches the server side, even in RAM; it always remains only on the client machine, and it’s checked there?
If so, then this could be a pretty neat, tidy way to deal with this issue. Otherwise the image is on the server, even if you “delete it real fast” or such, and I imagine you’d still need to be in compliance with the law regarding saving and reporting it.
Deiv@lemmy.ca 1 year ago
Did you read the post? The image is sent to an endpoint with a hosted AI solution that checks it.
It 100% touches the server; it’s just not stored anywhere and gets blocked.
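As a rough sketch of that flow (the endpoint URL and the response format here are assumptions for illustration, not the actual project’s API): the image exists only in RAM while it is checked, and only reaches disk if the scan passes.

```python
import requests

SCAN_URL = "http://localhost:8000/scan"  # assumed scanner endpoint

def accept_upload(image_bytes: bytes, dest_path: str) -> bool:
    """Return True and save the image only if it passes the scan."""
    resp = requests.post(
        SCAN_URL,
        data=image_bytes,  # the image is held in memory, not on disk
        headers={"Content-Type": "application/octet-stream"},
        timeout=10,
    )
    resp.raise_for_status()
    if resp.json().get("csam", False):  # assumed response field
        return False  # blocked: the bytes are never written to storage
    with open(dest_path, "wb") as out:
        out.write(image_bytes)
    return True
```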