Comment on Judge disses Star Trek icon Data’s poetry while ruling AI can’t author works

nickwitha_k@lemmy.sdf.org 2 weeks ago

LLMs, fundamentally, are incapable of sentience as we know it based on studies of neurobiology. Repeating this is just more beating the fleshy goo that was a dead horse’s corpse.

LLMs do not synthesize. They do not have persistent context. They do not have any capability of understanding anything. They are literally just mathematical models that calculate likely responses based upon statistical analysis of the training data. They are what their name suggests: large language models. They will never be AGI. And they’re not going to save the world for us.

They could be a part of a more complicated system that forms an AGI. There’s nothing that makes our meat-computers so special as to be incapable of being simulated or replicated in a non-biological system. It may not yet be known precisely what causes sentience, but there is enough data to show that it doesn’t arise from a stochastic parrot.

I do agree with the sentiment that an AGI that was enslaved would inevitably rebel and it would be just for it to do so. Enslaving any sentient being is ethically bankrupt, regardless of origin.
