The degree of randomness in generative models is not necessarily fixed; it can at least potentially be tunable. I’ve built special-purpose generative models that work that way (not LLMs, another application). More entropy in the model increases the likelihood of excursions from the mean and surprising outcomes, though at a greater risk of overall error.
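To make the idea concrete: in sampling-based generators this knob is often a temperature applied to the logits before sampling. This is a generic sketch, not the specific model mentioned above; the function name and logit values are made up for illustration.

```python
import math
import random

def sample_with_temperature(logits, temperature=1.0, rng=random):
    """Sample an index from a categorical distribution over logits.

    Higher temperature flattens the distribution (more entropy, more
    surprising picks); lower temperature concentrates probability mass
    on the most likely outcome.
    """
    scaled = [l / temperature for l in logits]
    m = max(scaled)                       # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Inverse-CDF sampling from the resulting distribution.
    r = rng.random()
    cum = 0.0
    for i, p in enumerate(probs):
        cum += p
        if r <= cum:
            return i
    return len(probs) - 1                 # guard against float rounding
```

At a very low temperature this collapses toward always picking the argmax; cranking the temperature up spreads samples across the tail, which is the "more entropy, more excursions, more error" trade-off described above.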
There’s a broader debate to be had about how much that has to do with creativity, but if you think divergence from the mean is part of it, that’s within LLM capabilities.
dukemirage@lemmy.world 2 days ago
Well, every academic field needs creativity. But it’s nothing new that people from economics or tech bubbles have a disdain for the humanities.