
Catbox.moe got screwed 😿

110 likes

Submitted 2 days ago by tal@lemmy.today to technology@lemmy.world

https://blog.catbox.moe/post/785233399498555392/important-catbox-needs-your-help


Comments

  • tal@lemmy.today 2 days ago

    Just noticed this on !technology@beehaw.org, which lemmy.world is defederated from. Since I've seen a number of people using catbox.moe to host content posted here before, I thought it'd be of broader interest.

  • db0@lemmy.dbzer0.com 2 days ago

    Crossposting from the other thread


    I wonder what kind of CSAM detection they have. If they're only relying on hash matching, they're gonna get fucked by novel GenAI CSAM. This is why stuff like fedi-safety exists, which they could use as well.

    • NateNate60@lemmy.world 2 days ago

      Hash matching is really easy to get around. Literally modify 1 bit of the image or just re-encode the video and you’ve gotten around it.
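
      A quick sketch of why that works, assuming Python with the standard-library hashlib ("image.jpg" is just a placeholder path): flipping a single bit produces a completely unrelated exact hash, so a blocklist keyed on file hashes never matches the modified copy.

      ```python
      import hashlib

      def exact_hash(data: bytes) -> str:
          # Cryptographic hash of the raw bytes -- what a naive blocklist keys on.
          return hashlib.sha256(data).hexdigest()

      original = open("image.jpg", "rb").read()   # placeholder file
      # Flip the lowest bit of the last byte: effectively the same image,
      # but the digest shares nothing with the original's.
      modified = original[:-1] + bytes([original[-1] ^ 0x01])

      print(exact_hash(original))
      print(exact_hash(modified))
      ```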

      • Warl0k3@lemmy.world 2 days ago

        Yes, it’s the bare minimum precaution you can implement. But at the same time, it’s the bare minimum precaution you can implement. There’s really no excuse for not doing it, and it catches a shocking number of images.

      • tomsh@lemmy.world 2 days ago

        blog.cloudflare.com/the-csam-scanning-tool/#fuzzy…

      • catloaf@lemm.ee 2 days ago

        They’ll be using perceptual hashes, not file hashes.
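
        For contrast, a minimal perceptual (average) hash sketch, assuming Python with Pillow installed; the file names are hypothetical. Real systems (PhotoDNA, Cloudflare's fuzzy hashing) are far more sophisticated, but the idea is the same: re-encoding or tweaking a few pixels only moves the hash by a few bits, so near-duplicates still match.

        ```python
        from PIL import Image   # assumes Pillow is installed

        def average_hash(path: str, size: int = 8) -> int:
            # Shrink, grayscale, then threshold each pixel against the mean brightness.
            img = Image.open(path).convert("L").resize((size, size), Image.LANCZOS)
            pixels = list(img.getdata())
            mean = sum(pixels) / len(pixels)
            bits = 0
            for p in pixels:
                bits = (bits << 1) | (1 if p > mean else 0)
            return bits

        def hamming(a: int, b: int) -> int:
            # Count of differing bits; a small distance means "perceptually similar".
            return bin(a ^ b).count("1")

        # Hypothetical files: a known-bad image and a re-encoded copy of it.
        print(hamming(average_hash("known_bad.png"), average_hash("copy_reencoded.jpg")))
        ```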

  • net00@lemm.ee 2 days ago

    Lots of Lemmy content is hosted on catbox; hopefully they get it sorted. I sent a one-time donation, since I prefer to host my own images and my usage doesn't warrant a monthly subscription.

    Hopefully some of those heavy users pitch in.

    • SkyezOpen@lemmy.world 2 days ago

      Catbox has been super inconsistent about whether it even loads for the past year or so, and the last few months it hasn't worked at all. I thought it might be my app, but either way I'd be glad to see Lemmy move away from it.

  • Xanza@lemm.ee 2 days ago

    The post explains it perfectly, though… They didn't really get screwed: someone uploaded “animated child sexual abuse photos,” which is not only illegal but also against Patreon's TOS.

    Not really sure how this equates to them “being screwed.” They're responsible for what people upload to their host, which is why unrestricted public uploads like this aren't a great idea.

    • chemical_cutthroat@lemmy.world 2 days ago

      Yeah, it really comes back to the question of whether a file hosting service is responsible for what the user uploads, which is an argument that has been going on since the beginning of the internet. Ultimately, yes, I think they are. I think you have to actively moderate what is uploaded, but on top of that, there has to be swifter and stricter punishment for those that do upload things that are against TOS and/or illegal. If someone is uploading CSAM, then law enforcement needs to go after them. Maybe they'll actually do something people appreciate, instead of killing minorities.

      • southsamurai@sh.itjust.works 2 days ago

        Shit, I down voted you for whining about down votes.

      • NateNate60@lemmy.world 2 days ago

        You’re being downvoted because your assertion that hosts are responsible for what users upload is generally false.

        (1) Treatment of Publisher or Speaker.—No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.

        (2) Civil Liability.—No provider or user of an interactive computer service shall be held liable on account of—

        (A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or

        (B) any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in [subparagraph (A)].

        47 U.S.C. § 230(c), a.k.a. Communications Decency Act of 1996 § 230

      • Xanza@lemm.ee 2 days ago

        > Ultimately, yes, I think they are.

        There’s no opinion to be had. You are absolutely morally and legally responsible for what your users upload. Period.

      • ArchRecord@lemm.ee 1 day ago

        I’ll gladly give you a reason. I’m actually happy to articulate my stance on this, considering how much I tend to care about digital rights.

        Services that host files should not be held responsible for what users upload, unless:

        1. The service explicitly caters to illegal content by definition or practice (i.e. if the website is literally titled uploadyourcsamhere[.]com, then it's safe to assume they deliberately want to host illegal content)
        2. The service has a very easy mechanism to remove illegal content, either when asked, or through simple monitoring systems, but chooses not to do so (catbox does this, and quite quickly too)

        Because holding services responsible creates a whole host of negative effects. Here are some examples:

        • Someone starts a CDN and some users upload CSAM. The creator of the CDN goes to jail now. Nobody ever wants to create a CDN because of the legal risk, and thus the only providers of CDNs become shady, expensive, anonymously-run services with no compliance mechanisms.
        • You run a site that hosts images, and someone decides they want to harm you. They upload CSAM, then report the site to law enforcement. You go to jail. Anybody in the future who wants to run an image sharing site must now self-censor to try and not upset any human being that could be willing to harm them via their site.
        • A social media site hosts the posts and content of its users. In order to be compliant and not go to jail, it must engage in extremely strict filtering, because even one mistake could land the operators in jail. All users of the site are prohibited from posting any NSFW or even suggestive content (including newsworthy media, such as an image of bodies in a warzone), and any violation leads to an instant ban, because any of those things could raise the chance of actually illegal content being attached.

        This isn’t just my opinion either. Digital rights organizations such as the Electronic Frontier Foundation have talked at length about similar policies before. To quote them:

        “When social media platforms adopt heavy-handed moderation policies, the unintended consequences can be hard to predict. For example, Twitter’s policies on sexual material have resulted in posts on sexual health and condoms being taken down. YouTube’s bans on violent content have resulted in journalism on the Syrian war being pulled from the site. It can be tempting to attempt to “fix” certain attitudes and behaviors online by placing increased restrictions on users’ speech, but in practice, web platforms have had more success at silencing innocent people than at making online communities healthier.”

        Now, to address the rest of your comment, since I don’t just want to focus on the beginning:

        > I think you have to actively moderate what is uploaded

        Catbox does, and as previously mentioned, often at a much higher rate than other services, and at a comparable rate to many services that have millions, if not billions of dollars in annual profits that could otherwise be spent on further moderation.

        > there has to be swifter and stricter punishment for those that do upload things that are against TOS and/or illegal.

        The problem isn't necessarily the speed at which people can be reported and punished, but rather that the internet is fundamentally harder to track people on than real life. It's easy for cops to stake out a spot where they know someone will be physically distributing illegal content, but digitally, even if you can see the feed of all the information passing through the service, a VPN or Tor connection will anonymize the uploader's IP address in a way most police departments won't be able to track, and that even the three-letter agencies will only crack with a relatively low success rate.

        There's no good solution to this problem of identifying perpetrators, which is why platforms so frequently focus on moderation over legal enforcement against users. It accomplishes the goal of preventing and removing the content without having to, for example, require every single user of the internet to scan an ID (and then somehow also prevent people from simply stealing other people's access tokens and impersonating them).

        I do agree, however, that we should probably provide more funding, training, and resources to divisions whose sole goal is to go after the online distribution of various illegal content, primarily that which harms children, because there are certainly still too many reports to go through, even if many of them will lead to dead ends.

        I hope that explains why making file hosting services liable for user-uploaded content probably isn't the best strategy. I hate to see people with good intentions support ideas that sound good on paper but in the end just cause more untold harm, and I hope you can understand why I believe this to be the case.

  • tyler@programming.dev 2 days ago

    > To both host 130+ TB of data, and push 1 petabyte of traffic per month on any cloud provider would be, quite frankly, impossible.

    Uhhh, I'm sorry, but catbox isn't some special case here. Cloud providers are literally built to do this; Netflix literally hosted off of AWS for years. Also, using Cloudflare or another provider to reduce your needed bandwidth would make the costs so much cheaper, and it probably would have kept Patreon from dropping them from the platform, since CF has CSAM detection and quarantining.
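
    A back-of-the-envelope sketch of that cost argument (Python; every price below is an assumption based on rough public list pricing, not a quote): at ~1 PB/month, egress dwarfs storage, which is exactly the part that fronting with a CDN or using a zero-egress object store removes.

    ```python
    # Traffic/storage figures are from the blog post; all prices are assumed.
    EGRESS_GB_PER_MONTH = 1_000_000   # ~1 PB/month of outbound traffic
    STORAGE_GB = 130_000              # 130+ TB stored

    egress_cost  = EGRESS_GB_PER_MONTH * 0.07   # assumed ~$0.07/GB blended cloud egress
    storage_cost = STORAGE_GB * 0.02            # assumed ~$0.02/GB-month object storage
    print(f"big-cloud estimate:   ~${egress_cost + storage_cost:,.0f}/month")

    # Same storage on a zero-egress object store behind a CDN (R2-style pricing assumed).
    print(f"zero-egress estimate: ~${STORAGE_GB * 0.015:,.0f}/month")
    ```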

    There’s plenty to be said about hosting things yourself, both good and bad, but in this case routing through CF really would have stopped a lot of this and reduced costs massively.

  • Feathercrown@lemmy.world 2 days ago

    Catbox is blocked by my ISP anyway due to the CSAM issue.

  • secret300@lemmy.sdf.org 2 days ago

    I’d love to read it but Tumblr is fucking ass.

    This pop-up won't go away. I'm still gonna donate, but fucking christ, stop using Tumblr.

    Image

  • Alphane_Moon@lemmy.world 2 days ago

    What the hell is catbox.moe?

    Isn’t “moe” a weird anime/manga genre with hints of pedo motifs?
