Comment on Oncoliruses: LLM Viruses are the future and will be a pest, say goodbye to decent tech.
davidgro@lemmy.world 3 days ago
Why would someone direct the output of an LLM to a terminal on its own machine like that? That just sounds like an invitation to an ordinary disaster, with all the ‘rm -rf’ content on the Internet (aka training data). Even then it still wouldn’t be access to a second machine, and even if it could make a copy, the result would be either an exact copy or an incomplete (broken) one. There’s no reasonable way it could ‘mutate’ and still work using terminal commands.
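A minimal Python sketch of that “exact copy” point (the file names are hypothetical stand-ins): a plain file copy can be checked as bit-identical, so the outcome is either the same model or a corrupted one, not a usefully changed one.

```python
import hashlib
import shutil

def sha256_of(path: str) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

# "model.bin" is a hypothetical stand-in for whatever file the model would copy.
shutil.copyfile("model.bin", "model_copy.bin")

# A straightforward copy is either bit-identical or broken; the copy
# operation itself introduces no working "mutation".
assert sha256_of("model.bin") == sha256_of("model_copy.bin")
```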
And being a meme requires minds. There were no humans or other minds in my analogy, nor in your question.
IAmNorRealTakeYourMeds@lemmy.world 3 days ago
It is so funny that you are all like “that would never work, because there are no such things as vulnerabilities on any system.”
Why would I? The whole point is to create an LLM virus, and if the model is good enough, it is not that hard to create.
davidgro@lemmy.world 3 days ago
Of course vulnerabilities exist. And creating a major one like this for an LLM would likely lead to it destroying things the way a toddler would (in fact, this has already happened to a company run by idiots).
But what it didn’t do was copy-with-changes, as would be required to ‘evolve’ like a virus, because training these models requires intense resources and isn’t just a terminal command.
IAmNorRealTakeYourMeds@lemmy.world 3 days ago
Who said they need to retrain? A small modification to their weights in each copy is enough. That’s basically training with extra steps.
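Mechanically, that claim looks something like this minimal PyTorch sketch (file names and the noise scale are made up, and it assumes the checkpoint is a plain dict of tensors); whether such a perturbed copy still works well is exactly what is in dispute above.

```python
import torch

# Hypothetical checkpoint file; assumed to hold a plain dict of tensors.
state = torch.load("weights.pt", map_location="cpu")

# Add small Gaussian noise to every floating-point tensor -- the
# "small modification to the weights in each copy" described above.
mutated = {
    name: t + 0.001 * torch.randn_like(t) if t.is_floating_point() else t
    for name, t in state.items()
}

torch.save(mutated, "weights_mutated.pt")
```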