Comment on A Developer Accidentally Found CSAM in AI Data. Google Banned Him For It
Devial@discuss.online 3 days ago
They didn't get mad. Did you even read my comment?
cupcakezealot@piefed.blahaj.zone 3 days ago
They obviously did if they banned him for it; and if they're training on CSAM and refuse to do anything about it, then yeah, they have a connection to it.
Devial@discuss.online 3 days ago
Also, the data set wasn’t hosted, created, or explicitly used by Google in any way.
It was a common data set used in academic papers on training nudity detectors.
Did you seriously just read the headline, guess what happened, and then argue based on that guess that I, who actually read the article, am wrong about its content? Because that's sure what it feels like reading your comments…
MangoCats@feddit.it 3 days ago
Google doesn't ban out of hate or feelings; they ban by algorithm. The algorithms address legal responsibilities and concerns. Are the algorithms perfect? No. Are they good? Debatable. Is it possible to replace those algorithms with thinking human beings who do a better job? Also debatable; from a legal standpoint, they're probably much better off arguing from a position of algorithmic enforcement versus training humans to do it.
Devial@discuss.online 3 days ago
So you didn't read my comment, did you?
He got banned because Google’s automated monitoring system, entirely correctly, detected that the content he unzipped contained CSAM. It wasn’t even a manual decision to ban him.