Comment on Epstein Files: X Users Are Asking Grok to 'Unblur' Photos of Children
calcopiritus@lemmy.world 12 hours ago
Tbf it’s not needed. If it can draw children and it can draw nude adults, it can draw nude children.
Just like it doesn’t need to have trained on purple geese to draw one. It just needs to know how to draw purple things and how to draw geese.
Paranoidfactoid@lemmy.world 4 hours ago
I don’t think so. Speaking as a parent.
calcopiritus@lemmy.world 3 hours ago
What don’t you think?
Why does being a parent give any authority in this conversation?
Paranoidfactoid@lemmy.world 1 hour ago
I have changed diapers and can attest to the anatomical differences between children and adults. AI can’t extrapolate those differences without training data that captures them; without such data in its model, it would hallucinate something absurd or impossible.
slampisko@lemmy.world 12 hours ago
That’s not exactly true. I don’t know about today, but I remember reading an article about a year ago about an image generation model that was unable, despite many attempts, to generate a wine glass filled to the brim, because all the wine glasses it was trained on were half-full.
calcopiritus@lemmy.world 12 hours ago
Did it have any full glasses of water? According to my theory, it has to have data for both “full” and “wine”.
vala@lemmy.dbzer0.com 10 hours ago
Your theory is more or less incorrect. It can’t interpolate as broadly as you think it can.
calcopiritus@lemmy.world 4 hours ago
The wine thing could prove me wrong if someone could answer my question.
But I don’t think my theory is that wild. These models can interpolate; that is a fact. You can ask for a bear with duck hands and they will draw it. I’ve seen AI-generated images of mashups like that online.
Who is to say interpolating nude children from regular children+nude adults is too wild?
Furthermore, you don’t need CSAM to get photos of nude children.
Children are nude at beaches all the time; there are probably many beach photos on the internet with nude children in the background. Those would probably help the model.
WraithGear@lemmy.world 12 hours ago
That’s not true. A child and an adult are not the same, and AI cannot do such things without the training data. It’s the full-wine-glass problem, and the only reason THAT example was fixed is that they literally trained the model on that specific thing.
Jarix@lemmy.world 10 hours ago
I’m not saying it wasn’t trained on CSAM, or defending any AI.
But your point isn’t correct.
The prompts you use and how you request changes can get the same results. Clever prompts already circumvent many hard-wired protections. It’s a game of whack-a-mole, and every new iteration of an AI will require different methods to bypass those protections.
If you ask it the right way, it will do whatever a prompt tells it to do.
It doesn’t take actual images/data in the training set if you can just tell it how to get the results you want by using language it hasn’t been told not to accept.
The AI doesn’t know what it is doing; it’s simply running points through its system and outputting the results.
MathiasTCK@lemmy.world 8 hours ago
It still seems pretty random. They’ll say they fixed it so it won’t do something, but all they likely did was reduce the probability, so we still get screenshots showing what it sometimes lets through.