If you know that it’s fancy autocomplete then why do you think it could “copy itself”?
It’s a stream of tokens. It doesn’t have access to the file systems it runs on, and certainly not to its own compiled binaries (much less its source code) - it doesn’t have access to its weights either. (Of course it would hallucinate that it does if asked)
This is like worrying that the music coming from a player piano might copy itself to another piano.
forrgott@lemmy.sdf.org 8 months ago
Sorry, no. An LLM is never going to spontaneously gain the ability to self-replicate. This is completely beyond the scope of generative AI.
This whole hype around AI and LLMs is ridiculous, not to mention completely unjustified. The appearance of a vast leap forward in this field is an illusion. They’re just linking more and more processor cores together, until a glorified chatbot can be made to appear intelligent. But this is stifling actual research and innovation in the field, instead turning the market into a costly, and destructive, arms race.
The current algorithms will never “be good enough to copy themselves”. No matter what a conman like Altman says.
IAmNorRealTakeYourMeds@lemmy.world 8 months ago
It’s a computer program. Give it access to a terminal and it can “cp” itself to anywhere in the filesystem, or send itself across a network.
“A program cannot copy itself”? Have you heard of a fork bomb? Or any computer virus?
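To illustrate the point being made here: any process that can read its own file on disk and has write access somewhere can trivially replicate that file. A minimal sketch in Python (the filenames and temp directory are just for illustration; this copies the program file itself, which is a separate question from an LLM accessing its own weights):

```python
import os
import shutil
import tempfile

# A running script that knows its own path can copy itself anywhere
# it has write access -- the "cp itself" step from the comment above.
src = os.path.abspath(__file__)                      # path to this very script
dst = os.path.join(tempfile.mkdtemp(), "copy_of_self.py")
shutil.copy(src, dst)                                # the actual "cp"

# The copy is byte-for-byte identical to the original.
with open(src, "rb") as a, open(dst, "rb") as b:
    assert a.read() == b.read()
print("replicated to", dst)
```

This is of course the trivial sense of “a program copying itself” (viruses and fork bombs are the classic examples); it says nothing about a model spontaneously deciding to do so.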