Comment on "Well, it looks like verification photos might be useless now."
Hildegarde@lemmy.world 10 months ago
The point of verification photos is to ensure that nsfw subreddits are only posting with consent. Many posts were just random nudes someone found, where the subject was not okay with having them posted.
The verification photos show an intention to upload to the sub. A former partner wanting to upload revenge porn would not have access to a verification photo. Subs often require the paper to be crumpled to make photoshopping infeasible.
If an AI can generate a photorealistic verification picture, it cannot be used to verify anything.
oce@jlai.lu 10 months ago
Was it really that hard to Photoshop well enough to get past mods who aren't experts in photo forensics?
can@sh.itjust.works 10 months ago
Probably not, but it would still reduce the amount considerably.
AnneBonny@lemmy.dbzer0.com 10 months ago
I think it takes a considerable amount of work to photoshop something written on a sheet of paper that has been crumpled up and flattened back out.
Sheeple@lemmy.world 10 months ago
If you have experience with the program, it's piss easy.
However, most people do not have that experience.
Serinus@lemmy.world 10 months ago
You also have to include the actual person holding something that can be substituted for the paper.
uriel238@lemmy.blahaj.zone 10 months ago
So you need a guy with such experience on your social engineering team.
psmgx@lemmy.world 10 months ago
It’s mostly about filtering the low-hanging fruit, aka the low effort trolls, repost bots, and random idiots posting revenge porn.
fine_sandy_bottom@discuss.tchncs.de 10 months ago
As in most things. I don't have security cameras to capture video of someone breaking in. I have them so my neighbour's house looks like an easier target.
xantoxis@lemmy.world 10 months ago
It would be easier to create verification photos with a particular expression this way. “Post a verification photo, frowning, with a blue pen in your teeth”, for example, might well be difficult to photoshop but is no harder than anything else to generate with AI.
xor@sh.itjust.works 10 months ago
there’s a lot of tools to verify if something was photoshopped or not… you don’t need to be an expert to use them
oce@jlai.lu 10 months ago
I tried some one day and I didn't find any that was actually easy for a noob. You have to check resolution, contrast, spatial frequency disruption, and none of it looked easy to detect without proper training.
xor@sh.itjust.works 10 months ago
i wouldn’t just go around telling people that…
trolololol@lemmy.world 10 months ago
Can you share more? Never had to use one.
bradorsomething@ttrpg.network 10 months ago
You can check whether the resolution changes across a video or photo. This can be overcome by limiting the whole image to the dpi of your lowest-resolution item, but most people go with what looks best instead of a flat level.
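For anyone curious what a basic check looks like in practice, here's a minimal sketch of one common technique, error level analysis: resave the image as JPEG and look at where the recompression error differs, since pasted-in regions often stand out. This is just an illustration using Pillow, not any specific tool mentioned above, and the filenames are placeholders.

```python
# Rough error-level-analysis (ELA) sketch. Assumes Pillow is installed
# (pip install Pillow); file paths are placeholders for illustration.
from PIL import Image, ImageChops, ImageEnhance

original = Image.open("verification.jpg").convert("RGB")

# Resave at a known JPEG quality and reload.
original.save("resaved.jpg", "JPEG", quality=90)
resaved = Image.open("resaved.jpg")

# Pixel-wise difference between the original and the recompressed copy.
# Edited or pasted regions often recompress differently and show up here.
diff = ImageChops.difference(original, resaved)

# Stretch the (usually faint) differences so they are visible to the eye.
max_diff = max(extrema[1] for extrema in diff.getextrema()) or 1
ela = ImageEnhance.Brightness(diff).enhance(255.0 / max_diff)
ela.save("ela_result.png")
```

It's far from foolproof (heavy recompression or a full AI-generated image won't show obvious seams), which is kind of the point of this whole thread.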
Reverendender@sh.itjust.works 10 months ago
On a side note, they are also used all the time for online selling and trading, as a means to verify that the seller is a real person who is in fact in possession of the object they wish to sell.
trolololol@lemmy.world 10 months ago
How did traditional - as in, before AI - photo verification know the image was not manipulated? In this post the paper is super flat, and I've seen many others like that.
Hildegarde@lemmy.world 10 months ago
From reading the verification rules of /r/gonewild, they require the same paper card to be photographed from different angles while being bent slightly.
Photoshopping a card convincingly may be easy. Photoshopping a bent card held at different angles that reads as the same in every image is much more difficult.
stebo02@lemmy.dbzer0.com 10 months ago
That last thing will still be difficult with AI. You can generate one image that looks convincing, but generating multiple images that are consistent? I doubt it.
uriel238@lemmy.blahaj.zone 10 months ago
The paper is real. The person behind it is fake.
WalrusDragonOnABike@reddthat.com 10 months ago
Curious how long it’ll be until we start getting AI 3D models of this quality.
KneeTitts@lemmy.world 10 months ago
Just ask for multiple photos of the person in the same place. AI has a hard time with temporal coherence, so in each picture the room items will change, the face will change a bit (maybe a lot), hairstyles will change… etc.
RainfallSonata@lemmy.world 10 months ago
I didn't realize they originated with verifying nsfw content. I'd only ever seen them in otherwise text-based contexts. It seemed to me the person in the photo didn't necessarily represent the account owner just because they were holding up a piece of paper showing the username. But if you're matching the verification against other photos, that makes more sense.
RedditWanderer@lemmy.world 10 months ago
It’s been used way before the nsfw stuff and the advent of AI.
Back in the day, if you were doing an AMA with a celeb, the picture proof was the celeb telling us this was the account they were using. It didn't need to be their account and was only useful for people with an identifiable face. If you were doing an AMA because you were some specialist or professional, showing your face and username didn't do anything; you needed to provide paperwork to the mods.
This is a poor way to police fake nudes though, I wouldn’t have trusted it even before AI.
uriel238@lemmy.blahaj.zone 10 months ago
It used to be tits or GTFO on /b/.
From now on I’ll have amazing tits.