It cannot
Comment on Gentoo Linux Begins Codeberg Migration In Moving Away From GitHub, Avoiding Copilot
Ladislawgrowlo@lemy.lol 10 hours agoreporting security issues
Is this not an advantage? If AI can find new security vulnerabilities reliably?
gwl@lemmy.blahaj.zone 8 hours ago
sp3ctr4l@lemmy.dbzer0.com 4 hours ago
Basically anywhere that LLMs are implemented… they are a security vulnerability, for any situation in which they are not sandboxed.
Anything they can interface with?
You can probably trick it or exploit it into doing something unintended or unexpected to anything else it is connected to.
Theoretically you could use an LLM to do something like come up with more accurate heuristics for identifying malware.
But… they’re nowhere near ‘intelligent’ enough to like, give it a whole code base for some kind of software, and thoroughly make that software 100% secure.
jjagaimo@sh.itjust.works 9 hours ago
It often makes up non existent vulnerabilities. I think it was curl getting flooded with fake vulnerability reports which drowns out real reports, esp because it can take time to parse through the code or run the poc
bananabread@lemmy.zip 9 hours ago
Or it could introduce new ones :)
eronth@lemmy.world 9 hours ago
Yeah, but you can have it scan without implementing.
JordanZ@lemmy.world 4 hours ago
I’ve had copilot suggest ‘fixing’ code to something that wasn’t even syntactically correct for the language and would break the build. If it can’t even figure out the super well documented syntax of a language I don’t trust it to find anything. The icing on the cake…it was a Microsoft language(C#).