On the other hand, if we move from larger and larger models trained on as much data as they can gather to less generic, more specific, high-quality datasets, I have a feeling there’s still a lot to gain. But quality over quantity takes a lot more effort to maintain.
Comment on Has Generative AI Already Peaked? - Computerphile
Gormadt@lemmy.blahaj.zone 9 months ago
[deleted]
admin@lemmy.my-box.dev 9 months ago
magic_lobster_party@kbin.run 9 months ago
The video is more about the diminishing returns from increasing the size of the training set: performance gains follow a logarithmic curve as the data grows. At some point, just “adding more data” won’t do much, because the cost of gathering it is too high compared to the gain in accuracy.
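To make the diminishing-returns point concrete, here’s a toy Python sketch (the constants a and b are made up for illustration, not taken from the video) where accuracy grows only with the logarithm of the training-set size:

```python
import math

# Toy model (assumed, not from the video): accuracy ~ a + b * log10(N)
# for training-set size N, capped at 1.0.
a, b = 0.50, 0.05

previous = None
for n in (10**6, 10**7, 10**8, 10**9, 10**10):
    acc = min(1.0, a + b * math.log10(n))
    gain = "" if previous is None else f" (+{acc - previous:.2f})"
    print(f"{n:>14,} examples -> accuracy ~ {acc:.2f}{gain}")
    previous = acc
```

In this toy model every 10× increase in data buys the same small bump in accuracy, so each extra point of accuracy costs ten times more data than the last one, which is the crux of the video’s argument.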
EasternLettuce@lemm.ee 9 months ago
Gormadt@lemmy.blahaj.zone 9 months ago
mriormro@lemmy.world 9 months ago
Didn’t read your comment but you’re dumb