Comment on AI models face collapse if they overdose on their own output

Warl0k3@lemmy.world 2 months ago

Wow, this is peak bad science reporting. I hate to be the one to break the news, but no, this headline is deeply misleading. We may want AI to hit its downfall, but these issues with recursive training data and training on small datasets have been more or less solved for 5+ years now. The Nature paper is interesting because it explains how specific kinds of recursion affect several model types, but that doesn't mean AI is going back into Pandora's box. Quite the opposite, in fact, since this will let us design even more robust systems.

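For context on what "solved" tends to look like in practice, here's a minimal toy sketch (my own illustration, not code from the Nature paper): repeatedly fit a 1-D Gaussian to samples drawn from its previous fit, and compare training purely on the model's own output against the common mitigation of mixing some original real data back into every round.

```python
# Toy illustration of recursive-training drift and one common mitigation:
# each "generation" refits a 1-D Gaussian to a batch that is either pure
# model output (recursion only) or model output blended with real data.
import numpy as np

rng = np.random.default_rng(0)
real = rng.normal(loc=0.0, scale=1.0, size=100_000)  # stand-in "real" data

def self_train(generations: int, n: int, real_fraction: float) -> float:
    """Return the fitted std after `generations` rounds of self-training.

    real_fraction is the share of each generation's n training points drawn
    from the original real data; 0.0 is the pure-recursion setting.
    """
    mu, sigma = 0.0, 1.0
    n_real = int(real_fraction * n)
    for _ in range(generations):
        synthetic = rng.normal(mu, sigma, size=n - n_real)
        batch = np.concatenate([synthetic, rng.choice(real, n_real)])
        mu, sigma = batch.mean(), batch.std()  # refit on the new batch
    return sigma

# Pure recursion tends to let the fitted spread drift toward zero, while
# anchoring each batch with real data keeps it near the true value of 1.
print("pure recursion   :", round(self_train(300, n=30, real_fraction=0.0), 3))
print("20% real data mix:", round(self_train(300, n=30, real_fraction=0.2), 3))
```

It's deliberately simplistic, but the same idea of blending fresh or retained real data with model output is the kind of thing that keeps recursive training from being the doomsday scenario the headline implies.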