Whenever you outsource something (like your intelligence) then it becomes a trust issue…
Claude Code deletes a developer's production setup, including its database and snapshots — 2.5 years of records were nuked in an instant
Submitted 1 month ago by throws_lemy@lemmy.nz to technology@lemmy.world
Comments
artyom@piefed.social 1 month ago
You can code this into its training all you want, but it will find a way around it. This is one of many problems with AI.
thebestaquaman@lemmy.world 1 month ago
Nah, you can run it in a box and limit its ability to interact with anything outside the box to certain white-listed endpoints. Depending on what you want to achieve, that can be more than safe enough.
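A minimal sketch of that kind of box, assuming Docker — the image name and mount paths here are placeholders, not anything from the article:

```shell
# Hypothetical invocation: "agent-image" stands in for whatever agent
# you actually run. --network none removes all network access, and
# --read-only plus a single bind mount limits what it can write to.
docker run --rm \
  --network none \
  --read-only \
  --tmpfs /tmp \
  -v "$PWD/project":/work \
  -w /work \
  agent-image

# For a whitelist instead of total isolation, attach a user-defined
# network and filter egress on the host, e.g. with DOCKER-USER rules:
#   iptables -I DOCKER-USER -d <whitelisted-IP> -j ACCEPT
#   iptables -I DOCKER-USER -j DROP
```

The point is that even a "rogue" agent can only damage the one mounted directory, which you can snapshot before each session.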
markz@suppo.fi 1 month ago
I thought this was about restricting the thing’s access and not training?
_stranger_@lemmy.world 1 month ago
You gotta be knowledgeable enough to know when they’re destructive, that’s the rub.
obelisk_complex@piefed.ca 1 month ago
Sure, but reading the article, I think he might be knowledgeable enough. His mistake seems to have been blindly trusting the keys to the kingdom to an enthusiastic junior dev who’ll be very sorry if they nuke your system, but won’t think to do a damn thing to make sure it doesn’t happen in the first place…
thespcicifcocean@lemmy.world 1 month ago
The only job AI is gonna take is the intern who fucks everything up.
psivchaz@reddthat.com 1 month ago
It legitimately is squeezing out the entry level already, and that is its own problem. Maybe it's good for some of us, in that people with experience will be needed for a long time while AI keeps younger people from getting that experience, but it absolutely sucks for a whole bunch of people trying to build a career, and it will eventually suck for the economy as a whole. AI, whether it's ready or not, or will ever be what the marketing people claim it is, is driving a whole lot of shortsighted decisions that are hurting people.
Rooster326@programming.dev 1 month ago
Do y’all still have Interns?
Who can even afford to not get paid to work in this economy?
wulrus@lemmy.world 1 month ago
This is a crazy use of AI!
What I have been considering, but haven’t found a readily available setup for yet: make a user with broad read permissions (most of /etc, with API keys & passwords kept in separate, excluded files). That could be done with very restrictive sudo patterns. Let the AI run commands directly as that user (it can run sudo -l to get an idea of what it’s allowed to do). Then use it like in Star Trek: “Computer — run a level 2 diagnostic.”
Not as the centre of attention when fixing a problem, but as additional input / modern rubber ducking.
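One way to sketch those restrictive sudo patterns, assuming a sudoers drop-in file and a hypothetical helper account (the account name and command list are purely illustrative):

```
# /etc/sudoers.d/aiduck -- hypothetical read-only helper account.
# Each pattern names an exact read-only command: no shells, no editors,
# nothing that writes. The account can inspect its grants with: sudo -l
aiduck ALL=(root) NOPASSWD: /usr/bin/journalctl -u *, \
                            /usr/bin/systemctl status *, \
                            /bin/cat /etc/nginx/*
```

One caveat: sudoers wildcards match entire argument strings, so overly loose patterns can smuggle in extra flags — keep each command as narrow as possible.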
paranoia@feddit.dk 1 month ago
I just run it in a VPS, don’t care if it nukes itself. Back up the files it works on every 3 hours.
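The every-3-hours backup could be as simple as a cron-driven tarball script — a sketch, with all paths and names hypothetical:

```shell
#!/bin/sh
# Hypothetical snapshot script: archive a working directory into
# timestamped tarballs, so a rogue agent can only lose the work done
# since the last snapshot.
snapshot() {
    src="$1"
    dest="$2"
    mkdir -p "$dest"
    # -C switches to the parent dir so the archive holds relative paths.
    tar -czf "$dest/work-$(date +%Y%m%d-%H%M%S).tar.gz" \
        -C "$(dirname "$src")" "$(basename "$src")"
}

# Demo run against throwaway directories:
src=$(mktemp -d)
dest=$(mktemp -d)
echo "hello" > "$src/notes.txt"
snapshot "$src" "$dest"
ls "$dest"

# A crontab entry like this would run it every 3 hours:
#   0 */3 * * * /usr/local/bin/snapshot.sh /srv/project /srv/backups
```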
webkitten@piefed.social 1 month ago
sigh
Use LLMs as instructional models, not as production/development models. It’s not hard, people. You don’t need to connect credentials to any LLM, just like you’d never write your production passwords on Post-its and stick them on your monitor.
artyom@piefed.social 1 month ago
Or don’t use LLMs at all, because they fucking lie to you constantly?
Semi_Hemi_Demigod@lemmy.world 1 month ago
“Lie” implies they have some kind of agency. They’re basically a Plinko board.
thebestaquaman@lemmy.world 1 month ago
Meh, they work well enough if you treat them as a rubber duck that responds. I’ve had an actual rubber duck on my desk for some years, but I’ve found LLMs taking over its role lately.
I don’t use them to actually generate code. I use them as a place where I can write down my thoughts. When the LLM responds, it has likely “misunderstood” some aspect of my idea, and by reformulating myself and explaining how it works I can help myself think through what I’m doing. Previously I would argue with the rubber duck, but I have to admit that the LLM is actually slightly better for the same purpose.
vext01@feddit.uk 1 month ago
I don’t understand why people aren’t sandboxing these things.
frongt@lemmy.zip 1 month ago
If he had had the sense to do that, he would have had the sense to not do it at all.
flamingo_pinyata@sopuli.xyz 1 month ago
How do you even achieve that? I have to coax it into correctly running the project locally.
motruck@lemmy.zip 1 month ago
Ever hear of a backup?
Kazel@lemmy.dbzer0.com 1 month ago
Nice ☺
Tigeroovy@lemmy.ca 1 month ago
Lmao good.
ramenshaman@lemmy.world 1 month ago
That’s what version control is for.
Valthorn@feddit.nu 1 month ago
What, is a requirement for Claude to work that you “sudo chmod -R 777 /” or something?
eestileib@lemmy.blahaj.zone 1 month ago
And nothing of value was lost
Jackhammer_Joe@lemmy.world 1 month ago
Good. Serves them right.
reksas@sopuli.xyz 1 month ago
i wonder which would be the worse idea: letting an LLM have full access to your critical systems and data, or letting random people from the internet freely connect to them and expecting them to help.
Naevermix@lemmy.world 1 month ago
skill issue tbh
HubertManne@piefed.social 1 month ago
It’s funny, because at the places where I worked and actually had the rights to do something like this (except small companies where I was the multi-hat guy, and even there I set things up so I had to jump through hoops to do anything like this), my reaction is just: this is crazy.
FreshLight@sh.itjust.works 1 month ago
There’s just one thing to say:
LMAO
Liketearsinrain@lemmy.ml 1 month ago
Good.
root@lemmy.world 1 month ago
3-2-1
zod000@lemmy.dbzer0.com 1 month ago
womp womp
MagnificentSteiner@lemmy.zip 1 month ago
FAFO
Bieren@lemmy.today 1 month ago
Ai or not. This is on the person who gave it prod access. I don’t care if the dev was running CC in yolo mode, not paying attention to it or CC went completely rogue. Why would you give it prod access, this is human error.