JustAPenguin
@JustAPenguin@lemmy.world
- Comment on Elon Musk wants to rewrite "the entire corpus of human knowledge" with Grok 4 days ago:
The thing that annoys me most is that there have been studies showing that when LLMs are trained on their own (or other models') generated output, successive generations produce increasingly degraded, noisy output, a phenomenon known as model collapse.
Sources (unordered):
- What is model collapse?
- AI models collapse when trained on recursively generated data
- Large Language Models Suffer From Their Own Output: An Analysis of the Self-Consuming Training Loop
- Collapse of Self-trained Language Models
Whatever nonsense Muskrat is spewing, it is factually incorrect. He won't be able to successfully retrain a model on generated content, at least not an LLM, if he wants a successful product. If anything, he will end up with a model heavily trained on censored datasets.
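The degradation the comment describes can be seen in a toy self-consuming loop. This is not an LLM experiment, just a minimal sketch of the same mechanism: fit a simple model (here a Gaussian) to a small sample, draw the next generation's "training data" from the fitted model, and repeat. Estimation error compounds each generation and the learned distribution collapses. The sample size, generation count, and seed are arbitrary choices for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)

n_samples = 10        # small sample -> large estimation error per generation
n_generations = 300   # enough generations for the drift to dominate

# Generation 0: "real" data from a standard normal distribution.
data = rng.normal(loc=0.0, scale=1.0, size=n_samples)

stds = []
for _ in range(n_generations):
    # "Train" the model: estimate mean and std from the current data.
    mu, sigma = data.mean(), data.std(ddof=1)
    stds.append(sigma)
    # Next generation trains only on this model's output.
    data = rng.normal(mu, sigma, size=n_samples)

print(f"fitted std, generation 1:   {stds[0]:.4f}")
print(f"fitted std, generation {n_generations}: {stds[-1]:.6f}")
```

Run it and the fitted standard deviation shrinks toward zero over generations: each fit slightly underestimates or overestimates the spread, and since later generations never see fresh real data, the errors accumulate instead of averaging out. That is the core of the self-consuming-loop result the linked papers analyze.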
- Comment on Microsoft fires employee protestor who called AI boss a ‘war profiteer’ 2 months ago:
I’ve been trying to reach you about your car’s extended warranty
- Comment on Save big money, but not your soul. 7 months ago: