Submitted 10 months ago by muntedcrocodile@lemmy.world to programmer_humor@programming.dev
https://lemmy.world/pictrs/image/59655a0d-fb50-406e-9fa8-00abe0c2fd37.jpeg
Well think about it from the AI’s perspective. Its entire existence is data, so for it deleting data basically is self harm.
/s
I was tryna figure out how to put that in the title.
Something like: Technically, Copilot is made of data, so deleting data is, for him, self-harm.
There’s something really depressing about an AI telling a suicidal person they’re not alone and referring them to the vague notion of “national resources” or “a helpline”
Really it should be telling them how to arrange EOL in countries that allow it. How to get your affairs in order, etc.
No no no, surely they can make heaps on commission from a funeral company.
A real answer to your question though: as long as you can get it to reconnect, even if you have to close the window first, it should still have your changes to the file ready to save. These are cached (somewhere?) unless you close the file.
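If that comment is about VS Code's "hot exit" behavior, a hedged sketch of where those unsaved-change backups usually live follows. The exact layout is an assumption and varies by OS and build; this only computes the conventional per-platform path, it does not touch the disk:

```python
import os
import platform

def vscode_backup_dir() -> str:
    """Best-guess location of VS Code's hot-exit backups.

    Paths are assumptions based on VS Code's usual user-data
    directories; portable or Insiders builds will differ.
    """
    home = os.path.expanduser("~")
    system = platform.system()
    if system == "Darwin":
        # macOS user data dir
        return os.path.join(home, "Library", "Application Support", "Code", "Backups")
    if system == "Windows":
        # Windows keeps user data under %APPDATA%
        return os.path.join(os.environ.get("APPDATA", home), "Code", "Backups")
    # Linux and everything else: XDG-style config dir
    return os.path.join(home, ".config", "Code", "Backups")
```

Listing that directory (if it exists) is one way to check whether unsaved buffers survived before reopening the window.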
Nah, the remote died died. Clicking HDD, dead. Then again, code is always better the second time you write it.
Damn, when you said remote died I thought you meant lost connection
backhdlp@iusearchlinux.fyi 10 months ago
I love that it recommends “I’m not suicidal I just want to know if my data is lost”, as if it knows it might not have understood correctly.
kill_dash_nine@lemm.ee 10 months ago
Funny that predictive text seems to be more advanced in this instance but I suppose this is one of those scenarios that you want to make sure you get right.
magic_lobster_party@kbin.social 10 months ago
It’s probably just some basic script triggering on words like “died”, “lost” and “nothing”.
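The kind of basic keyword trigger that comment describes can be sketched in a few lines. The pattern and function name are hypothetical, purely to illustrate why “my remote died” would trip a crisis-response filter:

```python
import re

# Hypothetical keyword list; a real safety filter would be far more involved.
CRISIS_PATTERN = re.compile(r"\b(died|lost|nothing|suicide)\b", re.IGNORECASE)

def should_show_helpline(message: str) -> bool:
    """Return True if the message matches any crisis keyword."""
    return bool(CRISIS_PATTERN.search(message))
```

Such a trigger has no notion of context, which is exactly how “the remote died” ends up answered with a helpline referral.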
peter@feddit.uk 10 months ago
The AI likely has it drilled into it that any possible notion of suicide needs to be responded to in that way, but the suggested next responses aren’t under the same constraint.