You wouldn't feed the images people generate and save back into the system to improve it?
Not after the initial training, no.
That would make it less effective: instead of being trained on known real data, it would be further reinforced on its own hallucinations.
DoYouNot@lemmy.world 9 months ago
This actually doesn’t work to improve the model, generally. It’s not new information for it.
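A toy sketch of the feedback loop being described, not any particular model's actual training pipeline: a simple categorical "model" is repeatedly refit on its own samples. All names and numbers are illustrative. Rare categories that fail to show up in a sample drop to probability 0 and can never come back, because no step ever looks at real data again.

```python
# Toy illustration of refitting a generative model on its own outputs.
# Assumption: a categorical distribution stands in for the model.
import random
from collections import Counter

random.seed(42)

categories = ["common", "uncommon", "rare", "very_rare"]
# "Real" distribution the original model was trained on.
probs = {"common": 0.70, "uncommon": 0.20, "rare": 0.08, "very_rare": 0.02}

SAMPLES_PER_GEN = 50  # small on purpose, so tail loss shows up quickly

for gen in range(10):
    # Generate outputs from the current model.
    samples = random.choices(
        categories, weights=[probs[c] for c in categories], k=SAMPLES_PER_GEN
    )
    counts = Counter(samples)
    # "Retrain": maximum-likelihood refit on the model's own outputs only.
    probs = {c: counts[c] / SAMPLES_PER_GEN for c in categories}
    print(f"gen {gen + 1}: " + ", ".join(f"{c}={probs[c]:.2f}" for c in categories))

# Typical run: "very_rare" hits 0.00 within a few generations and stays there.
# Once a category reaches probability 0 it can never be sampled again, so the
# loop only ever narrows the distribution; it adds no new information about
# the real data, which is the point made in the comments above.
```

Real image models are vastly more complex, but the mechanism sketched here is the same one the replies describe: training on generated outputs reinforces what the model already produces instead of teaching it anything new.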