Comment on Catbox.moe got screwed
Xanza@lemm.ee · 2 weeks ago
The photos explain it pretty well, though… They didn't really get screwed; someone uploaded "animated child sexual abuse photos," which is not only illegal but against Patreon's TOS.
Not really sure how this equates to them "being screwed." They're responsible for what people upload to their host, which is why unrestricted public uploads like this aren't a great idea.
chemical_cutthroat@lemmy.world · 2 weeks ago
Yeah, it really comes back to the question of whether a file hosting service is responsible for what its users upload, an argument that has been going on since the beginning of the internet. Ultimately, yes, I think they are. I think you have to actively moderate what is uploaded, but on top of that, there has to be swifter and stricter punishment for those who upload things that are against the TOS and/or illegal. If someone is uploading CSAM, then law enforcement needs to go after them. Maybe they'll actually do something people appreciate, instead of killing minorities.
southsamurai@sh.itjust.works · 2 weeks ago
Shit, I downvoted you for whining about downvotes.
chemical_cutthroat@lemmy.world · 2 weeks ago
How Libertarian of you.
merde@sh.itjust.works · 2 weeks ago
They gave a reason for their downvote, so no, they're not assumed to be a libertarian.
NateNate60@lemmy.world · 2 weeks ago
You're being downvoted because your assertion that hosts are responsible for what users upload is generally false.
47 U.S.C. § 230(c), a.k.a. Communications Decency Act of 1996 § 230. Subsection (c)(1): "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider."
deegeese@sopuli.xyz · 2 weeks ago
We're talking Patreon rules, not US law.
chemical_cutthroat@lemmy.world · 2 weeks ago
Which is exactly why I said TOS and not US law. I don't really agree with the laws here either, because they create a safe harbor for illegal ends, but I understand that it is a lot easier, and arguably better, to self-police the content. That is what Patreon is doing. They view it as a violation of their TOS to generate revenue on a site that knowingly and willingly hosts CSAM. I'm with Patreon on this one. This wasn't the first offence, and there is no way the person who runs the site doesn't know that material is on there. Pleading ignorance isn't going to work. Running anonymous file hosting, no matter how good your intentions, is going to bring out the worst of the internet, guaranteed. If you can somehow get around that logic, you've got a bright future with the NRA.
Xanza@lemm.ee · 2 weeks ago
This is not only incorrect (this particular law doesn't apply here), but I can easily prove it beyond any shadow of a doubt.
backpage.com was shut down despite its willingness to comply with the law because it was found to "facilitate" CSAM. Omegle was also temporarily shut down for the same reasons. There have also been quite literally dozens of prosecutions of website admins on the dark web for offering a platform for CSAM, despite them arguing in court that they had no control over what their users uploaded and quickly moderated the content when discovered. In the end, none of it matters: as a provider of a service, you are required to make it difficult to share CSAM, not just comply with the law when someone catches you with your pants down.
It baffles me that people are so laissez-faire about literal fucking CP, AI-generated or not.
And in spite of literally all of that, none of this has anything to do with US law. It's Patreon policy. They don't want to service someone who constantly has issues with CSAM, and they have every right not to offer their services to Catbox.
Xanza@lemm.ee · 2 weeks ago
There's no opinion to be had. You are absolutely morally and legally responsible for what your users upload. Period.
ArchRecord@lemm.ee · 2 weeks ago
I'll gladly give you a reason. I'm actually happy to articulate my stance on this, considering how much I tend to care about digital rights.
Services that host files should not be held responsible for what users upload, except in narrow circumstances, because holding services responsible creates a whole host of negative effects.
This isn't just my opinion, either. Digital rights organizations such as the Electronic Frontier Foundation have talked at length about similar policies before.
Now, to address the rest of your comment, since I don't just want to focus on the beginning:
Catbox does moderate, and as previously mentioned, often at a much higher rate than other services, and at a rate comparable to services with millions, if not billions, of dollars in annual profits that could otherwise be spent on further moderation.
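For context on how even a small host can moderate at that rate, the usual building block is hash-matching every upload against blocklists of known abusive files (industry systems such as PhotoDNA use perceptual hashes shared via organizations like NCMEC). Here's a minimal sketch of the idea, not Catbox's actual pipeline; the file names and functions are hypothetical, and it uses plain SHA-256 for simplicity:

```python
import hashlib

# Hypothetical blocklist file: one SHA-256 digest of a known-bad file
# per line. Real systems use perceptual hashes so that re-encoded or
# slightly edited copies still match; a cryptographic hash like this
# only catches byte-identical files.
BLOCKLIST_PATH = "known_bad_sha256.txt"

def load_blocklist(path: str) -> set[str]:
    """Load the set of known-bad digests from disk."""
    with open(path) as f:
        return {line.strip().lower() for line in f if line.strip()}

def screen_upload(data: bytes, blocklist: set[str]) -> bool:
    """Return True if the upload may be stored, False if it must be
    rejected (and, in a real system, reported to the authorities)."""
    return hashlib.sha256(data).hexdigest() not in blocklist

if __name__ == "__main__":
    blocklist = load_blocklist(BLOCKLIST_PATH)
    with open("upload.bin", "rb") as f:
        print("accepted" if screen_upload(f.read(), blocklist)
              else "rejected and queued for report")
```

The point is that this kind of automated screening is cheap enough for a donation-funded host to run on every upload; it's the novel, never-before-hashed material that requires human review, which is where the big-budget platforms spend their moderation money.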
The problem isn't necessarily the speed at which people can be reported and punished, but rather that the internet is fundamentally harder to track people on than real life. It's easy for cops to stake out a spot where they know someone will be physically distributing illegal content, but digitally, even if you can see the feed of all the information passing through the service, a VPN or Tor connection will anonymize an IP address in a way that most police departments can't trace, and that even most three-letter agencies will only crack with a relatively low success rate.
There's no good solution to this problem of identifying perpetrators, which is why platforms so often focus on moderation over legal enforcement actions against users. It accomplishes the goal of preventing and removing the content without having to, for example, require every single user of the internet to scan an ID (and also magically prevent people from just stealing other people's access tokens and impersonating them).
I do agree, however, that we should probably provide more funding, training, and resources to divisions whose sole goal is to go after the online distribution of various illegal content, primarily that which harms children, because there is certainly still an issue of too many reports to go through, even if many of them will lead to dead ends.
I hope that explains why making file hosting services liable for user-uploaded content probably isn't the best strategy. I hate to see people with good intentions support ideas that sound good on paper but in the end just cause untold harm, and I hope you can understand why I believe this to be the case.