SaucySnake
@SaucySnake@lemmy.world
- Comment on Welp ... 4 months ago:
arxiv.org/abs/2306.07899: here’s a paper that found that crowdsourced text, one of the biggest sources of LLM training data, is being corrupted by workers using AI to complete their tasks. There are plenty of other papers out there showing the effects of training on AI-generated output, which researchers call “model collapse”.