Comment on AI models collapse when trained on recursively generated data - Nature

TheOneCurly@lemm.ee 4 months ago

That’s what this is about… Continual training of new models is becoming difficult because so much generated content is flooding the datasets. The models don’t become biased or overly refined; they stop producing output that resembles human text. A toy sketch of the mechanism is below.
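This isn’t the paper’s actual setup, just a minimal toy sketch of the feedback loop: each generation is fit only to samples drawn from the previous generation, so estimation error compounds and the fitted distribution drifts away from the original data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Generation 0: "human" data from a standard normal distribution.
data = rng.normal(loc=0.0, scale=1.0, size=200)

for generation in range(20):
    # "Train" a model: fit a Gaussian to whatever data we currently have.
    mu, sigma = data.mean(), data.std()
    print(f"gen {generation:2d}: mean={mu:+.3f}, std={sigma:.3f}")

    # Next generation's training data is entirely model-generated output,
    # with no fresh human data mixed back in.
    data = rng.normal(loc=mu, scale=sigma, size=200)
```

Run for enough generations and the fitted standard deviation tends to drift toward zero: the tails of the original distribution disappear first, and the output stops resembling the data the loop started from.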
