Comment on Adobe’s ‘Ethical’ Firefly AI Was Trained on Midjourney Images
General_Effort@lemmy.world 7 months ago
That’s not what anyone would do in reality, though. In reality, when you train an AI model on AI output you get a quality increase, because the model learns to be better at doing the things it’s supposed to do, while forgetting the irrelevant. Where output looks samey, it’s because different people are chasing the same mainstream taste.
DarkThoughts@fedia.io 7 months ago
How do you get a quality increase if you by definition cut down on the variety of the generative aspects? That doesn't make any sense.
General_Effort@lemmy.world 7 months ago
Put like this: too much variety is the biggest problem in terms of quality. People don’t want variety in terms of, say, the number of limbs or fingers. People have something specific in mind when they prompt an AI. They only want very limited and specific variability.
In a sense, limiting variety is the whole point of the AI. There is a vast number of possible images. Most of them would be simply indistinguishable noise to us. The proportion we would consider a sensible picture is tiny. We want to constrain the variety to within this tiny segment.
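A bit of toy arithmetic (not from the thread itself; the image size and bit depth here are illustrative assumptions) shows how vast that space of possible images is:

```python
# How many distinct images exist at even a tiny resolution?
# Assume a 16x16 grayscale image with 256 intensity levels per pixel
# (illustrative numbers, chosen only to keep the arithmetic small).
width, height, levels = 16, 16, 256

# Every pixel is independent, so the count is levels^(width*height).
num_images = levels ** (width * height)  # 256^256 == 2^2048

# Python's arbitrary-precision ints let us count the decimal digits exactly.
digits = len(str(num_images))
print(f"A 16x16 grayscale image space contains a {digits}-digit number of images.")
```

Almost all of those images are indistinguishable noise; the "sensible picture" subset the generator is meant to land in is a vanishingly small slice of that 617-digit count.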
DarkThoughts@fedia.io 7 months ago
But "AI" generated images don't suffer from too much variety, they suffer from looking samey. It's the opposite of what you're arguing about.
Limbs aren't really the issue here since this is about Midjourney, which handles that part fairly well already.
General_Effort@lemmy.world 7 months ago
Adobe trained its AI “Firefly” on its stock library (and other images). Their library contains AI images. It’s unlikely that these are all from Midjourney.
I’m not sure what you mean by samey. As I said, people chase the same mainstream taste. If the images from one service look samey, then the service probably figures that’s what the customers want. It’s also possible that you only recognize this type of image as AI generated.