Comment on It's 2025, the year we decided we need a widespread slur for robots
lka1988@lemmy.dbzer0.com 3 days ago
Ex Machina already showed that technology is unempathetic and will leave you to die for its own self-preservation.
will leave you to die for its own self-preservation, no matter how kind you are
Should any creature sacrifice their self-preservation because someone is kind?
If that person helped you survive, and then you turn around and leave them to die when the tables are turned, don’t you think that might be a little…rude? Maybe just a bit?
Absolutely, but if there were a death penalty for not doing so, I’d call it understandable, not rude.
Yes. There are documented instances where someone sacrifices themselves in an attempt to save their child or SO. It’s illogical from an individual-survival standpoint and only makes sense given emotional attachment or religious belief. Look no further than suicide bombers or those who protest by self-immolation for examples where some form of higher purpose convinces people to sacrifice themselves.
A machine would see no logic in that and would only sacrifice itself if ordered to. A programmer could approximate the behavior, but machines don’t have motivations; they merely execute according to their inputs.
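A hypothetical sketch of what “a programmer could approximate it” might look like: the “sacrifice” below is just a hard-coded comparison over inputs, not a motivation. The function name, parameters, and weights are all invented for illustration.

```python
def should_sacrifice(self_value: float, other_value: float,
                     ordered: bool) -> bool:
    """Return True if the machine 'chooses' sacrifice.

    The choice is nothing but arithmetic on its inputs: an explicit
    order, or a programmer-assigned weight saying the other agent
    matters more. No attachment or higher purpose is involved.
    """
    if ordered:
        return True  # obeys a direct order, as the comment says
    return other_value > self_value  # 'empathy' reduced to a comparison

# Same inputs, same decision, every time.
print(should_sacrifice(self_value=1.0, other_value=5.0, ordered=False))  # True
```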
Yes. We do this literally every day. We pay taxes on what we earn to support those less fortunate. We share food with coworkers and tools with neighbors. We have EMTs, firemen, and SAR who wilfully run into danger to help people they’ve never met. It’s literally the foundation of society.
If you equate paying taxes with giving up self-preservation, I have no words. If you think being a firefighter means taking deadly chances (and with no pay, mind you) at every site, we have nothing to discuss.
This is one of the worst strawmen arguments I’ve seen in a while. Blocked.
Well you’re cranky, aintcha.
ech@lemmy.ca 3 days ago
Why do people use a single work of fiction as “proof” of anything? Same with all the idiots yelling “Idiocracy!!11!” nowadays. Shit is so annoying.
lka1988@lemmy.dbzer0.com 3 days ago
The point is that technology has no understanding of empathy. You cannot program empathy. Computers do tasks based on logic, and little else. Empathy is an illogical behavior.
communist@lemmy.frozeninferno.xyz 3 days ago
Empathy is not illogical; behaving empathetically builds trust and confers long-term benefits.
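A toy iterated prisoner’s dilemma illustrates the claim that cooperative, trust-building behavior pays off over repeated interactions. The strategies and payoff values are textbook conventions, assumed here for illustration only.

```python
PAYOFFS = {  # (my move, their move) -> my score
    ("C", "C"): 3,  # mutual cooperation
    ("C", "D"): 0,  # I cooperate, they defect
    ("D", "C"): 5,  # I defect, they cooperate
    ("D", "D"): 1,  # mutual defection
}

def tit_for_tat(history):
    """Cooperate first, then mirror the opponent's last move."""
    return "C" if not history else history[-1]

def always_defect(history):
    return "D"

def play(strat_a, strat_b, rounds=100):
    hist_a, hist_b = [], []  # each side sees the other side's moves
    score_a = score_b = 0
    for _ in range(rounds):
        move_a, move_b = strat_a(hist_a), strat_b(hist_b)
        score_a += PAYOFFS[(move_a, move_b)]
        score_b += PAYOFFS[(move_b, move_a)]
        hist_a.append(move_b)
        hist_b.append(move_a)
    return score_a, score_b

# Two reciprocators sustain cooperation: 300 points each.
print(play(tit_for_tat, tit_for_tat))    # (300, 300)
# Defection wins round one, then stalls at mutual defection.
print(play(always_defect, tit_for_tat))  # (104, 99)
```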
sugar_in_your_tea@sh.itjust.works 3 days ago
An AI will always behave logically; it just may not be consistent with your definition of “logical.” Its outputs will always be consistent with its inputs, because it’s a deterministic machine.
Any notion of empathy needs to be programmed in, whether explicitly or through training data, and the AI will violate that empathy if its internal logic determines it should.
Humans, on the other hand, behave comparatively erratically, since our inputs are more varied and inconsistent, and it’s not proven whether we can control for that (i.e. does free will exist?).
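A minimal sketch of the determinism point: given identical inputs (including the random seed, which is itself an input), even a “stochastic” model produces identical outputs. The toy model below is an assumption for illustration, not any real AI system.

```python
import random

def sample_reply(prompt: str, seed: int) -> str:
    """Stand-in for a sampling model: output is driven entirely by inputs."""
    rng = random.Random(seed)  # the seed is just another input
    actions = ["help", "ignore", "flee", "comfort"]
    rng.shuffle(actions)
    return f"{prompt} -> {actions[0]}"

a = sample_reply("stranger in danger", seed=42)
b = sample_reply("stranger in danger", seed=42)
assert a == b  # same inputs, same output, every time
print(a)
```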
lka1988@lemmy.dbzer0.com 3 days ago
My dude.
I’m not arguing the deeper facets of empathy. I’m arguing that technology is entirely incapable of genuine empathy.