If you’ve ever used it you can see how easily it can happen.
Yes, I can see how it can easily happen to stupid lazy people.
At first you sandbox it and you’re careful. Then after a while the sandbox is a bit of a pain, so you just run it as is. Then it asks for permission a thousand times to do something, and at first you carefully check each command, but after a while you just skim them, and eventually, sure, you can run ‘psql *’ to debug some query on the dev instance…
It’s one of the major problems with the “full self-driving” stuff as well. It’s right often enough that eventually you get complacent or your attention drifts elsewhere.
This kind of stuff happened before LLM coding agents existed; they have just supercharged the speed and, as a result, increased the amount of damage that can be done before it’s noticed.
A bunch of failures already have to be in place for something like this to happen: having the prod credentials available, and so on. It’s just that instead of rolling the dice every couple of weeks, your LLM is rolling them every 20 seconds.
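The approval-fatigue pattern described above can be blunted by refusing obviously destructive commands outright instead of relying on a human to keep reading the thousandth prompt. A minimal sketch, assuming a hypothetical wrapper around whatever the agent wants to execute (the `guard` function and its patterns are made up for illustration, not any real tool’s config):

```shell
# Hypothetical pre-execution check: block commands containing
# destructive SQL keywords, allow everything else through.
guard() {
  case "$*" in
    *DROP*|*DELETE*|*TRUNCATE*)
      echo "blocked: $*"
      return 1
      ;;
    *)
      echo "allowed: $*"
      return 0
      ;;
  esac
}

guard psql -c "SELECT 1 FROM users"   # passes the check
guard psql -c "DROP TABLE users" || true  # refused before it runs
```

A crude keyword match like this is easy to evade, of course; the sturdier fix is the one raised below: the agent’s credentials simply shouldn’t reach prod at all.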
BorgDrone@feddit.nl 6 hours ago
How could this happen easily? A regular developer shouldn’t even have access to production outside of exceptional circumstances (e.g. diagnosing a production issue). Certainly not as part of the normal dev process.
tempest@lemmy.ca 3 hours ago
They shouldn’t, and we know that, but this is hardly the first time that story has been told, even before LLMs. Usually it was blamed on “the intern” or whatever.
BorgDrone@feddit.nl 3 hours ago
This isn’t just an issue with a developer putting too much trust into an LLM though. This is a failure at the organizational level. So many things have to be wrong for this to happen.
If an ‘intern’ can access a production database then you have some serious problems. No one should have access to that in normal operations.
tempest@lemmy.ca 2 hours ago
Sure, I’m not telling you how it should be, I’m telling you how it is.
The LLM just increases the damage done because it can do more damage faster before someone figures out they fucked up.
This is the last big one I remember offhand, but I know it happens a couple of times a year, and probably more just goes unreported.
www.cnn.com/2021/…/solarwinds123-password-intern