Grok floods X with sexualized images of women and children: Grok generated an estimated 3 million sexualized images, including 23,000 of children in 11 days
Submitted 3 weeks ago by Beep@lemmus.org to technology@lemmy.world
https://counterhate.com/research/grok-floods-x-with-sexualized-images/
Comments
BigMacHole@sopuli.xyz 3 weeks ago
I LOVE how we’re giving this CHILD PORN CREATION TOOL BILLIONS of US Taxpayer Dollars while ALSO Spending US Taxpayer Dollars on PROTECTING JEFFREY EPSTEIN and ALSO Spending US Taxpayer Dollars on ARMED MEN KIDNAPPING CHILDREN TO PLACES WE’RE NOT ALLOWED TO SEE!
pewgar_seemsimandroid@lemmy.blahaj.zone 3 weeks ago
can we give a random teen the nuclear codes?
JelleWho@lemmy.world 3 weeks ago
000000 (if they still haven’t changed them, at least)
homesweethomeMrL@lemmy.world 3 weeks ago
Jumbie@lemmy.zip 3 weeks ago
Tyrq@lemmy.dbzer0.com 3 weeks ago
If people are still on his platform after this and the Grok thing, they are stupid, evil, or both.
floofloof@lemmy.ca 3 weeks ago
It’s astonishing how many organizations are still using it for their official communications when there are ready alternatives.
wonderingwanderer@sopuli.xyz 3 weeks ago
Or they lost their password and the email associated with the account, and can’t log in to delete their account…
Batmorous@lemmy.world 2 weeks ago
Or they’re not very in tune with everything. I only learned about all this a couple of months ago, since a shitty upbringing kept me isolated for a long time. It depends on what kept people from knowing about all this. There are some people I know who have never heard of any of this either, because they just focus on living their life day by day.
LadyAutumn@lemmy.blahaj.zone 3 weeks ago
This is a fucking nightmare. Twitter has effectively become a sexual exploitation generator.
I have argued extensively with people on lemmy about why having AI porn generated of you without your consent is deeply traumatizing and a violation of your rights and privacy. Actual entire threads of people telling me that AI deep fake porn was perfectly fine and that we’re being too sensitive to not want our male peers and random strangers making AI deep fake porn of us. Wonder where those people are now.
johncandy1812@lemmy.ca 3 weeks ago
Please stop using Musk’s products. He’s a Nazi.
DarrinBrunner@lemmy.world 3 weeks ago
See, though, there aren’t any consequences for illegal actions by the filthy rich. That’s why they’re better than the poors.
Yerbouti@sh.itjust.works 3 weeks ago
So Nazis and child porn, huh? Twitter is the ultimate American conservative’s wet dream.
SpookyBogMonster@lemmy.ml 3 weeks ago
*shakes fist at cloud* Back in my day, that’s what 4chan was for!
glibg@lemmy.ca 3 weeks ago
Why does it say that Grok is doing this? Wouldn’t it be more accurate to say that real people are using Grok as a tool to create this shit? Like place the blame where it belongs.
Earthman_Jim@lemmy.zip 3 weeks ago
The blame kinda rests with the creators of the tool as well as the content, no?
PabloSexcrowbar@piefed.social 2 weeks ago
This is what I want to know. It doesn’t do anything unless someone tells it to. Why aren’t the people telling it to make child porn being held accountable? My 3D printer can make guns, but they won’t send the printer to jail if I decide to tell it to make one.
fruitycoder@sh.itjust.works 3 weeks ago
“Twitter users use child porn machine to make millions of images of child porn” would be accurate, imho. Legally, though, AI-generated works are owned by no one, because the machine made them but can’t own them.
anon_8675309@lemmy.world 3 weeks ago
Elon is a disgusting human being.
Baggie@lemmy.zip 3 weeks ago
Hey so question, how come we have to push so hard for a child porn machine to be called a child porn machine?
ChickenLadyLovesLife@lemmy.world 3 weeks ago
I don’t watch PBS News Hour but my parents do and I have to listen to it from time to time. They characterized this issue as one of Grok creating “explicit” AI images and artificially generating pictures of real women (not “girls”) in “bathing suits”. Not exactly an accurate characterization of CSAM.
prole@lemmy.blahaj.zone 3 weeks ago
And that’s the “leftist propaganda” according to MAGA. PBS and NPR have so overcorrected, and in exchange have gotten nothing but defunded
ChickenLadyLovesLife@lemmy.world 3 weeks ago
In my opinion, the rightward correction has gotten even worse since the defunding. For a long time now they’ve run a graphic showing their corporate sponsors before each broadcast (Meta and oil companies often show up there). They love to say it’s “viewers like you” that make them possible, but I think the corporate sponsors are a lot more important. It’s been a very long time since the government funding has even been that big a chunk of their income.
fluffykittycat@slrpnk.net 3 weeks ago
It does it to pictures of real 12-year-olds, and nudes too, not just the Nazi bikinis (but also that).
phoenixz@lemmy.ca 3 weeks ago
Paid for by the US taxpayers who keep giving Elmo more money to not do projects, but do this shit instead
billwashere@lemmy.world 3 weeks ago
Why is Grok not being charged with CSAM generation?
Stabbitha@lemmy.world 3 weeks ago
Have you seen who’s in charge of our government? They’re probably the ones making the prompts
TheGrandNagus@lemmy.world 3 weeks ago
Because the US seems to be supportive of it. They lashed out at the governments that told X to resolve it or face site bans.
JensSpahnpasta@feddit.org 2 weeks ago
Let’s be clear here: It is not Grok. Grok is software developed by employees of Elon Musk that is capable of generating child porn and deepfakes of ordinary people. That software doesn’t have the safeguards to prevent this and was released to the public. Elon Musk, the whole leadership of X.com, and their employees there were made aware that this was happening and did nothing for several days. So let’s not pretend that some “Grok” was doing it.
drosophila@lemmy.blahaj.zone 3 weeks ago
Imagine going back in time to 2015 and showing this article to someone.
cley_faye@lemmy.world 3 weeks ago
Don’t worry, I’ve heard they limited this wonderful feature to paid accounts.
BigMacHole@sopuli.xyz 3 weeks ago
The Pentagon used OUR Tax Dollars to BUY This tool! How COOL!
-Democrats in Office!
kevin2107@lemmy.world 3 weeks ago
Isn’t it about 53k per violation?
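(A rough back-of-envelope sketch in Python, assuming the commenter’s ~$53k-per-violation figure and the ~23,000 images of children estimated in the headline report; neither number is verified here.)
estimated_child_images = 23_000        # estimate from the headline report
claimed_fine_per_violation = 53_000    # USD, per the comment above (unverified)
total_exposure = estimated_child_images * claimed_fine_per_violation
print(f"${total_exposure:,}")          # $1,219,000,000 - over a billion dollars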
Nomorereddit@lemmy.today 3 weeks ago
But I can’t even get Grok to make a picture of a three-boobed Sydney Sweeney. Smh
WanderingThoughts@europe.pub 3 weeks ago
Time to rename it XXX.
pineapplelover@lemmy.dbzer0.com 3 weeks ago
Saving the children as usual
Gorilladrums@lemmy.world 3 weeks ago
Even if Grok shut down completely, it wouldn’t mean anything. Pandora’s box is open and AI-generated porn is here to stay. There are soooooooooooooooo many websites that exist just to generate deepfake nudes and AI porn. You take down one, another 100 pop up. It’s a futile game of whack-a-mole.
Even if we passed laws banning this shit, the technology that enables it to be a thing is free, open source, and can very easily be modified to do precisely this. Anybody can run these models locally at any time, and nobody can do a thing about it. Basically what I’m trying to say is that we’re cooked.
fluffykittycat@slrpnk.net 3 weeks ago
I mean, yeah, but it’s one thing if some perv’s running it on their own box after reading 5 guides vs. Elon Musk having a Twitter bot that does it for you, without even trying to stop it even after knowing it’s doing it. The former is unavoidable; the latter is a choice he’s made for some fucking reason.
Gorilladrums@lemmy.world 3 weeks ago
I mean, I agree that we should prevent it in obvious cases like this; I was just making the point that this is going to be a very persistent issue.
brooke592@sh.itjust.works 3 weeks ago
I agree.
It’s fine if people want to get mad about it, but it’s more effective to just learn to live with it because it’s not going away.
kali_fornication@lemmy.world 3 weeks ago
it was just one guy who was like
for num in range(3000000):
    if num < 23000:
        Grok.draw_sexualized_image(minor=True)
        continue
    Grok.draw_sexualized_image(minor=False)
nutsack@lemmy.dbzer0.com 3 weeks ago
I think this code could be made more efficient
Tollana1234567@lemmy.today 3 weeks ago
Musk is strongly associated with the “Ms. Kung Fu” lady, and Epstein, so I’m not surprised.
Chivera@lemmy.world 3 weeks ago
Time to start creating and spreading nudes of all these billionaires wives.
explodicle@sh.itjust.works 3 weeks ago
Weren’t they already picking out their wives from public nudes?
morto@piefed.social 3 weeks ago
Hope this expels the remaining people still using X purely out of inertia.
brooke592@sh.itjust.works 3 weeks ago
We care more about this than we do about school shootings.
Minimac@lemmy.ml 2 weeks ago
Grok must be outlawed, it’s truly sickening!
Sturgist@lemmy.ca 2 weeks ago
Grok didn’t make itself.
D_C@sh.itjust.works 3 weeks ago
Has anyone seen muskyboy lately? Do we know if he’s got a sore arm from constantly wanking off?
SaharaMaleikuhm@feddit.org 3 weeks ago
Where is Zensuraula when you need her?
voracitude@lemmy.world 3 weeks ago
Limited liability companies are “limited” in the sense that there are limits on the responsibilities of the members of the corporation. The CEO can’t be held personally liable for the actions of the company, for example; their underlings could have been responsible and kept the leader in the dark.
However, there’s this interesting legal standard wherein it is possible to “pierce the corporate veil” and hold corporate leadership accountable for illegal actions their company took, if you can show that by all reasonable standards they must or should have known about the illegal activity.
Anyway, Elon has been elbow-deep in the inner workings of Xitter for years now, by his own admission, right? Really getting in there to tinker and build new stuff, like Grok and its image generation tools. Seems like he knows an awful lot about how that works. An awful lot.
Zombiepirate@lemmy.world 3 weeks ago
Image
SatansMaggotyCumFart@piefed.world 3 weeks ago
Oh man the last three years have been rough on him.
Hopefully the next three are worse.
snooggums@piefed.world 3 weeks ago
The onus should be on the company to prove their employees kept the CEO in the dark, not the other way around.
voracitude@lemmy.world 3 weeks ago
The law is a funny creature. I own a business myself (just started, actually!) and it would suck to be brought up on charges I have no idea about but I’m being held personally liable for. I’m grateful for the LLC protection in that case. Of course, I’m also not planning on committing any crimes, nor having my business commit crimes, so it’s a minor worry. Really only important in the event the law gets weaponised against the people, say for example by a foreign asset in high office… 😬
bluGill@fedia.io 3 weeks ago
That is a tricky question. It isn’t just “does the CEO know”, but “should the CEO have known”. If you make a machine that injures people, the courts ask whether you should have expected that.
The first time someone uses a lawnmower to cut a hedge and gets hurt, the company can say “we never expected someone to be that stupid” - but we now know people do such stupid things, so if you make a lawnmower and someone uses it to cut a hedge, the courts will ask why you didn’t stop them - and the response is then “we can’t think of how to stop them, but look at the warnings we put on it”.
When Grok was first used to make porn, X could get by with “we didn’t think of that”. However, this is now known, and they need to do more to stop it. There are a number of options. The best is to fix Grok so it can’t do that; they could also just collect enough information on users that when it happens the police can arrest the person who instructed Grok. There are a number of other options; whether a court accepts them depends on whether the tool is otherwise useful and whether whatever they do reduces the amount of porn (or whatever evil) that gets through - perfection isn’t needed, but it needs to get close.