Comment on AI models face collapse if they overdose on their own output

Alphane_Moon@lemmy.world 2 months ago

Thanks for the reply.

I guess we’ll see what happens.

I still find it difficult to get my head around how a decrease in novel training data will not eventually cause problems (even with techniques to work around this in the short term, which I am sure work well on a relative basis).
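The worry above can be illustrated with a toy simulation (a hedged sketch, not anything from the article): if each "generation" of a model is fit only to samples drawn from the previous generation, information is lost at every step. In the simplest 1-D Gaussian case, the maximum-likelihood variance estimate shrinks by a factor of (1 − 1/n) in expectation each round, so the distribution gradually collapses. The function name and parameters below are hypothetical, chosen only for the demo.

```python
import numpy as np

def collapse_demo(n_samples=20, generations=1000, seed=0):
    """Toy model-collapse sketch: repeatedly fit a Gaussian to samples
    drawn from the previous generation's fitted Gaussian.

    A 1-D stand-in for "training on your own output"; all parameters
    here are illustrative assumptions, not from the article.
    """
    rng = np.random.default_rng(seed)
    mu, sigma = 0.0, 1.0        # generation 0: the "real data" distribution
    stds = [sigma]
    for _ in range(generations):
        samples = rng.normal(mu, sigma, n_samples)   # the model's "output"
        mu, sigma = samples.mean(), samples.std()    # refit on own output (MLE)
        stds.append(sigma)
    return stds

stds = collapse_demo()
print(f"std: generation 0 = {stds[0]:.3f}, final generation = {stds[-1]:.3g}")
```

With small sample sizes the fitted spread decays toward zero over many generations, which is the same qualitative effect described for model collapse: tails and rare variation disappear first, and each workaround only slows the drift rather than replacing the lost novelty.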

A bit of an aside: I also have zero trust in the people behind current LLMs, both the leadership (e.g. Altman) and the rank and file. If it's in their interest to downplay the scope and impact of model degeneracy, they will not hesitate to lie about it.
