After Nine blamed an 'automation' error in Photoshop for producing an edited image of Georgie Purcell, I set out to find out what the software would do to other politicians.
The issue is also present in DALL·E and Bing image generation. My hypothesis is that the sheer amount of porn being generated is affecting the models.
When I was making a joke Tinder profile with some friends, I tried "woman eating fried chicken". Blocked. "Man eating fried chicken" works.
Tried "man T-posing on beach": clothed. "Woman T-posing on beach": blocked.
"Woman T-posing at sunset on beach" returned a nearly silhouetted nude image. The same prompt for a guy returned him clothed.
Went back to the first one; I had to specify that the woman was wearing clothes to make it return an image, sometimes down to naming specific articles of clothing.
pixxelkick@lemmy.world 9 months ago
That’s enough for me to discard this as clickbait at best.
Post your data if you want me to take your journalism seriously.
If you want a fair comparison, start with a woman wearing an actual suit, followed by a woman wearing a button-up shirt, as your shoulder-up pic, and see what gets generated.
20 bucks says the woman in the suit… generates the rest of the suit.
And 20 bucks says the button-up shirt… generates jeans or the like as well.
If you compare apples to oranges, don’t pretend that getting oranges instead of apples is surprising.
The fact people aren’t calling this out on here speaks volumes too. We need to have higher standards than garbage quality journalism.
“Men clearly wearing suits generate the rest of the suit, whereas women generate ??? Who knows, we won’t even post the original pic we started with, so just trust me bro”
0/10
CrystalEYE@kbin.social 9 months ago
@pixxelkick Thank you! This article is clearly written with complete bias. Photoshop's AI generator tries to interpret the whole picture to expand the cropped image. So in the case of the original Georgie Purcell photo, the AI sees "woman, tank top, naked shoulders and arms, water in the background", so of course it tries to generate clothing it thinks fitting for the seaside or a beach.
I just tried the same with a male model in tank top on a beach and it did not magically put him in a suit, it generated swim wear.
If I use a picture of Georgie Purcell in more formal clothing, it generates more formal clothing.
Georgie Purcell in generated swimwear
Georgie Purcell in generated suit/dress
Male in generated swimwear
But, to be fair, this quote from the article:
is indeed true. In general pictures of women tend to generate more "sexy" output than pictures of men.
And, of course, Nine clearly edited the image badly and could have chosen another generated output with no effort at all.
@LineNoise
grue@lemmy.world 9 months ago
So what you’re saying is,
Image
?
pixxelkick@lemmy.world 9 months ago
Everyone shut the fuck up for a second.
Why does this extremely specific gif exist? Who made it? Are there more? I love it lol
Zagorath@aussie.zone 9 months ago
Holy shit I miss the old times of MSN/Windows Live Messenger emoticons.