Comment on A Developer Accidentally Found CSAM in AI Data. Google Banned Him For It
Devial@discuss.online 4 days agoDid you even read the article? The dude reported it anonymously, to a child protection org, not Google, and his account was nuked as soon as he unzipped the data, because the content was automatically flagged.
Google didn’t even know he reported this.
Whostosay@sh.itjust.works 3 days ago
It seems they did react to it though
Devial@discuss.online 3 days ago
They didn’t react to anything. The automated system (correctly) flagged and banned the account for CSAM, and as usual, the manual ban appeal sucked ass and didn’t do what it’s supposed to do. This is barely newsworthy. The real headline should be about how hundreds of CSAM images were freely available and shareable from this data set.
Whostosay@sh.itjust.works 3 days ago
An automatic reaction is a reaction
Devial@discuss.online 3 days ago
They reacted to the presence of CSAM. It had nothing whatsoever to do with it being contained in an AI training dataset, as the comment I originally replied to seems to think.