
Grok floods X with sexualized images of women and children: Grok generated an estimated 3 million sexualized images, including 23,000 of children in 11 days

651 likes

Submitted 2 days ago by Beep@lemmus.org to technology@lemmy.world

https://counterhate.com/research/grok-floods-x-with-sexualized-images/


Comments

  • JensSpahnpasta@feddit.org 1 hour ago

    Let’s be clear here: It is not Grok. Grok is software developed by Elon Musk’s employees that is capable of generating child porn and deepfakes of ordinary people. That software doesn’t have the safeguards to prevent this and was released to the public. Elon Musk, the whole leadership of X.com, and their employees were made aware that this was happening and did nothing for several days. So let’s not pretend that some “Grok” was doing it.

  • Minimac@lemmy.ml 1 hour ago

    Grok must be outlawed, it’s truly sickening!

  • voracitude@lemmy.world 2 days ago

    Limited liability companies are “limited” in the sense that there are limits on the liability of the members of the corporation. The CEO can’t be held personally liable for the actions of the company, for example; their underlings could have been responsible and kept the leader in the dark.

    However, there’s this interesting legal standard wherein it is possible to “pierce the corporate veil” and hold corporate leadership accountable for illegal actions their company took, if you can show that by all reasonable standards they must or should have known about the illegal activity.

    Anyway, Elon has been elbow-deep in the inner workings of Xitter for years now, by his own admission, right? Really getting in there to tinker and build new stuff, like Grok and its image generation tools. Seems like he knows an awful lot about how that works. An awful lot.

    • Zombiepirate@lemmy.world 2 days ago

      Image

      • SatansMaggotyCumFart@piefed.world 2 days ago

        Oh man the last three years have been rough on him.

        Hopefully the next three are worse.

    • snooggums@piefed.world 1 day ago

      The CEO can’t be held personally liable for the actions of the company, for example; their underlings could have been responsible and kept the leader in the dark.

      The onus should be on the company to prove their employees kept the CEO in the dark, not the other way around.

      • voracitude@lemmy.world 1 day ago

        The law is a funny creature. I own a business myself (just started, actually!) and it would suck to be brought up on charges I have no idea about but I’m being held personally liable for. I’m grateful for the LLC protection in that case. Of course, I’m also not planning on committing any crimes, nor having my business commit crimes, so it’s a minor worry. Really only important in the event the law gets weaponised against the people, say for example by a foreign asset in high office… 😬

    • bluGill@fedia.io 2 days ago

      That is a tricky question. It isn’t just whether the CEO knew, but whether the CEO should have known. If you make a machine that injures people, the courts ask whether you should have expected that.

      The first time someone uses a lawnmower to cut a hedge and gets hurt, the company can say “we never expected someone to be that stupid” - but we now know people do such stupid things, so if you make a lawnmower and someone uses it to cut a hedge, the courts will ask why you didn’t stop them - and the response is then “we can’t think of how to stop them, but look at the warnings we put on.”

      When Grok was first used to make porn, X could get by with “we didn’t think of that.” However, this is now known, and they need to do more to stop it. There are a number of options. Best is to fix Grok so it can’t do that; they could also collect enough information on users that when it happens the police can arrest the person who instructed Grok. There are a number of other options; whether the courts accept them depends on whether the tool is otherwise useful and whether whatever they do reduces the amount of porn (or whatever evil) that gets through - perfection isn’t needed, but it needs to get close.

  • BigMacHole@sopuli.xyz 1 day ago

    I LOVE how we’re giving this CHILD PORN CREATION TOOL BILLIONS of US Tax Payer Dollars while ALSO Spending US Tax Payer Dollars on PROTECTING JEFFREY EPSTEIN and ALSO Spending US Tax Payer Dollars on ARMED MEN KIDNAPPING CHILDREN TO PLACES WE’RE NOT ALLOWED TO SEE!

    • pewgar_seemsimandroid@lemmy.blahaj.zone 1 day ago

      can we give a random teen the nuclear codes?

      • JelleWho@lemmy.world 1 day ago

        000000 (if they still haven’t changed them, at least)

  • Baggie@lemmy.zip 22 hours ago

    Hey so question, how come we have to push so hard for a child porn machine to be called a child porn machine?

  • johncandy1812@lemmy.ca 1 day ago

    Please stop using Musk’s products. He’s a Nazi.

  • homesweethomeMrL@lemmy.world 2 days ago

    Image

    • Jumbie@lemmy.zip 1 day ago

      It wasn’t just one salute. Fucker did it twice with gusto.

      Image

    • Tyrq@lemmy.dbzer0.com 1 day ago

      If people are still on his platform after this and the Grok thing, they are stupid, evil, or both.

      • Batmorous@lemmy.world 6 hours ago

        Or not very in tune with everything. I only learned about all this a couple of months ago; a shitty upbringing kept me isolated for a long time. It depends on what kept people from knowing about all this. There are some people I know who have never heard of any of this either, because they just focus on living their life day by day.

      • floofloof@lemmy.ca 1 day ago

        It’s astonishing how many organizations are still using it for their official communications when there are ready alternatives.

      • wonderingwanderer@sopuli.xyz 1 day ago

        Or they lost their password and the email associated with the account, and can’t log in to delete their account…

  • phoenixz@lemmy.ca 21 hours ago

    Paid for by the US taxpayers who keep giving Elmo more money to not do projects, but do this shit instead

  • LadyAutumn@lemmy.blahaj.zone 1 day ago

    This is a fucking nightmare. Twitter has effectively become a sexual exploitation generator.

    I have argued extensively with people on lemmy about why having AI porn generated of you without your consent is deeply traumatizing and a violation of your rights and privacy. Actual entire threads of people telling me that AI deep fake porn was perfectly fine and that we’re being too sensitive to not want our male peers and random strangers making AI deep fake porn of us. Wonder where those people are now.

  • Yerbouti@sh.itjust.works 1 day ago

    So nazis and child porn, huh? Twitter is the ultimate American conservative’s wet dream.

    • SpookyBogMonster@lemmy.ml 1 day ago

      *shakes fist at cloud* Back in my day, that’s what 4chan was for!

  • DarrinBrunner@lemmy.world 2 days ago

    See, though, there aren’t any consequences for illegal actions by the filthy rich. That’s why they’re better than the poors.

  • glibg@lemmy.ca 1 day ago

    Why does it say that Grok is doing this? Wouldn’t it be more accurate to say that real people are using Grok as a tool to create this shit? Like place the blame where it belongs.

    • PabloSexcrowbar@piefed.social 7 hours ago

      This is what I want to know. It doesn’t do anything unless someone tells it to. Why aren’t the people telling it to make child porn being held accountable? My 3D printer can make guns, but they won’t send the printer to jail if I decide to tell it to make one.

    • Earthman_Jim@lemmy.zip 1 day ago

      The blame kinda rests with the creators of the tool as well as the content, no?

    • fruitycoder@sh.itjust.works 20 hours ago

      “Twitter users use child porn machine to make millions of images of child porn” would be accurate, imho. Legally, though, AI-generated works are owned by no one, because the machine made it but can’t own it.

  • BigMacHole@sopuli.xyz 22 hours ago

    The Pentagon used OUR Tax Dollars to BUY This tool! How COOL!

    -Democrats in Office!

  • anon_8675309@lemmy.world 1 day ago

    Elon is a disgusting human being.

  • ChickenLadyLovesLife@lemmy.world 1 day ago

    I don’t watch PBS News Hour but my parents do and I have to listen to it from time to time. They characterized this issue as one of Grok creating “explicit” AI images and artificially generating pictures of real women (not “girls”) in “bathing suits”. Not exactly an accurate characterization of CSAM.

    • prole@lemmy.blahaj.zone 1 day ago

      And that’s the “leftist propaganda” according to MAGA. PBS and NPR have so overcorrected, and in exchange have gotten nothing but defunded

      • ChickenLadyLovesLife@lemmy.world 1 day ago

        In my opinion, the rightward correction has gotten even worse since the defunding. For a long time now they’ve run a graphic showing their corporate sponsors before each broadcast (Meta and oil companies often show up there). They love to say it’s “viewers like you” that make them possible, but I think the corporate sponsors are a lot more important. It’s been a very long time since the government funding has even been that big a chunk of their income.

    • fluffykittycat@slrpnk.net 1 day ago

      it does it to pictures of real 12 year olds and nudes too, not just the nazi bikinis (but also that)

  • pineapplelover@lemmy.dbzer0.com 22 hours ago

    Saving the children as usual

  • billwashere@lemmy.world 1 day ago

    Why is Grok not being charged with CSAM generation?

    • Stabbitha@lemmy.world 1 day ago

      Have you seen who’s in charge of our government? They’re probably the ones making the prompts

    • TheGrandNagus@lemmy.world 1 day ago

      Because the US seems to be supportive of it. They lashed out at governments that told X to resolve it or face site bans.

  • drosophila@lemmy.blahaj.zone 1 day ago

    Imagine going back in time to 2015 and showing this article to someone.

  • cley_faye@lemmy.world 1 day ago

    Don’t worry, I’ve heard they limited this wonderful feature to paid accounts.

  • kevin2107@lemmy.world 1 day ago

    Isn’t it about 53k per violation?

  • Gorilladrums@lemmy.world 1 day ago

    Even if Grok shut down completely, it wouldn’t mean anything. Pandora’s box is open and AI-generated porn is here to stay. There are soooooooooooooooo many websites that exist just to generate deepfake nudes and AI porn. You take down one, another 100 pop up. It’s a futile game of whack-a-mole.

    Even if we passed laws banning this shit, the technology that enables it to be a thing is free, open source, and can very easily be modified to do precisely this. Anybody can run these models locally at any time, and nobody can do a thing about it. Basically what I’m trying to say is that we’re cooked.

    • fluffykittycat@slrpnk.net 1 day ago

      I mean, yeah, but it’s one thing if some perv’s running it on their own box after reading 5 guides vs. Elon Musk having a Twitter bot that does it for you, without even trying to stop it even after knowing it’s doing it. The former is unavoidable; the latter is a choice he’s made for some fucking reason.

      • Gorilladrums@lemmy.world 21 hours ago

        I mean, I agree that we should prevent it in obvious cases like this; I was just making the point that this is going to be a very persistent issue.

    • brooke592@sh.itjust.works 21 hours ago

      I agree.

      It’s fine if people want to get mad about it, but it’s more effective to just learn to live with it because it’s not going away.

  • brooke592@sh.itjust.works 21 hours ago

    We care more about this than we do about school shootings.

  • Tollana1234567@lemmy.today 1 day ago

    Musk is strongly associated with the “ms kung fu” lady, and Epstein, so I’m not surprised.

  • Nomorereddit@lemmy.today 1 day ago

    But I can’t even get Grok to make a picture of a 3-boobed Sydney Sweeney. Smh

  • WanderingThoughts@europe.pub 1 day ago

    Time to rename it XXX.

  • Chivera@lemmy.world 1 day ago

    Time to start creating and spreading nudes of all these billionaires wives.

    • explodicle@sh.itjust.works 1 day ago

      Weren’t they already picking out their wives from public nudes?

  • kali_fornication@lemmy.world 2 days ago

    it was just one guy who was like

    for num in range(3000000):
        if num < 23000:
            Grok.draw_sexualized_image(minor=True)
            continue
        Grok.draw_sexualized_image(minor=False)

    • nutsack@lemmy.dbzer0.com 1 day ago

      I think this code could be made more efficient

  • morto@piefed.social 1 day ago

    hope this expels the remaining people still using x out of inertia

  • D_C@sh.itjust.works 1 day ago

    Has anyone seen muskyboy lately? Do we know if he’s got a sore arm from constantly wanking off?

  • SaharaMaleikuhm@feddit.org 1 day ago

    Where is Zensuraula when you need her?
