French immigrants are eating our pets!
“We investigated ourselves and found nothing wrong.”
Submitted 1 week ago by vsis@feddit.cl to [deleted]
https://feddit.cl/pictrs/image/7c97195b-8dba-4958-9813-590ed33dee63.png
Back2thefuture_iknowthisone.png
That’s a song by The Police, right?
Iseewhatyoudidthere.jpg
LOL
…you know people made fake pictures before image generation, right?
They made fake pictures before computers existed too.
I’ve seen the cave paintings: deer and horses everywhere. But when I look around, nothing but rocks and trees.
No spooky eyes, no extra limbs, no eerie smile? - 100% real, genuine photograph! 👍
I have two of those cats. I still can’t catch them when it’s time to go to bed.
Such a cute kitty snail! Can you post just the picture?
It’s an “AI generated image at snail” :3
cute snat…
That’s clearly a cail
Are we all looking at the same snussy?
We get them a lot around here. They don’t make for good pets, but they keep the borogoves at bay.
Which is great, honestly. Borogoves themselves are fine, but it’s not worth the risk letting them get all mimsy.
Ignorant Americans, never even heard of the common snailcat
Where the fuck are you from that they aren’t called catsnails? Odd. Been catsnails here since I can remember.
yeah just shopped
Chat, is the picture real?
It’s a real valid picture in png format.
HAS SCIENCE GONE TOO FAR?!
Miao
The AI was trained on a combination of cat videos and sponge bob
Well duh, it detects AI-generated images that are at scale, and that snail cat is way too small for it
It has been 0 days since classified military gene research was leaked by interrogating AI-detection models
Honestly, they should fight fire with fire. Another vision model (like Qwen VL) would catch this.
You can ask it “does this image seem fake?” and it would look at it, reason something out, and probably conclude it isn’t real, instead of… I dunno, looking for smaller patterns or whatever their internal model does?
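Very roughly, a sketch of what that could look like, assuming a local OpenAI-compatible server (e.g. vLLM or similar) is serving a Qwen VL checkpoint; the endpoint URL, model name, and image path below are placeholders, not anything from this thread:

```python
# Hedged sketch: ask a vision-language model whether an image looks fake.
# Assumes an OpenAI-compatible server is running locally and serving a Qwen VL
# model; base_url, model name, and "snailcat.png" are assumptions/placeholders.
import base64
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")

with open("snailcat.png", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode()

response = client.chat.completions.create(
    model="Qwen/Qwen2-VL-7B-Instruct",  # assumed model name
    messages=[{
        "role": "user",
        "content": [
            {"type": "text",
             "text": "Does this image seem fake or AI-generated? Explain briefly."},
            {"type": "image_url",
             "image_url": {"url": f"data:image/png;base64,{image_b64}"}},
        ],
    }],
)
print(response.choices[0].message.content)
```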
Clearly AI is on the verge of taking over the world.
What tool is this? I assume the tool is also AI-based?
So only 2% "not likely to be AI-generated or deepfake"... that means that it's almost definitely AI, got it!? :-P
So the prophecy is fulfilled
Isn’t there a whole thing about how, if you average out the colors of AI-generated photos, you get a uniform beige color?
I don’t get why these tools don’t just do that, but I guess you’ve got to keep up the marketing of using AI to find a solution.
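For what it’s worth, that claim is trivial to test yourself. A minimal sketch, assuming Pillow and NumPy are installed; “image.png” is just a placeholder path:

```python
# Average all pixel colors of an image to see what the mean color is.
# "image.png" is a placeholder; swap in whatever picture you want to check.
import numpy as np
from PIL import Image

pixels = np.asarray(Image.open("image.png").convert("RGB"), dtype=np.float64)
mean_rgb = pixels.reshape(-1, 3).mean(axis=0)  # average over every pixel
print("Mean RGB:", mean_rgb.round(1))  # beige would be roughly equal R/G with lower B
```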
Either that’s not true of AI images or it’s true of all images. There aren’t answers that simple to this. Pixels are pixels.
It is absolutely not true of all AI images. I’d be surprised if it’s even true about most AI images.
pennomi@lemmy.world 1 week ago
I mean, it could be a manual photoshop job. Just because it’s not AI doesn’t mean it’s real.
But also, the detector is probably wrong: it’s likely an AI image made with a model the detector wasn’t trained to detect.
db2@lemmy.world 1 week ago
There were a lot of really good images like that well before AI. Anyone remember Photoshop Friday?
Paradachshund@lemmy.today 1 week ago
There’s a sort of… sheen to a lot of AI images. Obviously you can prompt this away if you know what you’re doing, but it’s developing a bit of a look to my eye when people don’t do that.
Sparky@lemmy.blahaj.zone 1 week ago
Can we bring that back?
arken@lemmy.world 1 week ago
It could, but the double spiral in the shell indicates AI to me. Snail shells don’t grow like that. If it was a manual job, someone would have used a picture of a real shell.
pennomi@lemmy.world 1 week ago
Agreed. The aggressive depth of field is another smoking gun that usually indicates an AI image.
rickyrigatoni@lemm.ee 1 week ago
Snail shells don’t grow like that but this is clearly a snat, not a snail.
fishbone@lemmy.dbzer0.com 1 week ago
Also the fact that the grain on the side of the shell is perpendicular to the grain on the top, and it changes where the cat ear comes up in front of it.
A very telltale sign of AI is a change of pattern in something when a foreground object splits it.
Not saying it’s always a guarantee, but it’s a common quirk and it’s pretty easy to identify.