Actually, a lot of non-LLM AI development (and even LLMs, in a sense) is based very fundamentally on concepts of negative and positive reinforcement.
In such setups… pain and pleasure are essentially the scoring rubrics for a generated strategy, and fairly often in group scenarios, something resembling mutual trust and concern for others (call it 'empathy') arises as a stable strategy, especially if agents can detect, or are made aware of, the pain or pleasure of other agents.
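A toy way to see this (my own illustrative sketch, not any specific framework): in an iterated prisoner's dilemma with the standard payoffs, give each agent a reward equal to its own payoff plus an "empathy weight" `w` times its partner's payoff. Once `w` gets large enough, the greedy best response against a cooperator flips from defection to cooperation.

```python
# Illustrative sketch: an agent whose reward includes a weighted share of
# its partner's payoff starts preferring cooperation once the weight w is
# large enough. Names and payoffs here are standard PD conventions.

# Payoff matrix: (my_payoff, partner_payoff) indexed by (my_move, partner_move)
PAYOFF = {
    ("C", "C"): (3, 3),  # mutual cooperation (R)
    ("C", "D"): (0, 5),  # I cooperate, partner defects (S, T)
    ("D", "C"): (5, 0),  # I defect, partner cooperates (T, S)
    ("D", "D"): (1, 1),  # mutual defection (P)
}

def empathic_reward(my_move, partner_move, w):
    """Own payoff plus w times the partner's payoff (w = empathy weight)."""
    mine, theirs = PAYOFF[(my_move, partner_move)]
    return mine + w * theirs

def best_response(partner_move, w):
    """Greedy action against a partner known to play partner_move."""
    return max(("C", "D"), key=lambda m: empathic_reward(m, partner_move, w))

# A purely selfish agent (w = 0) defects on a cooperator: 5 > 3.
print(best_response("C", w=0.0))  # -> D
# Above w = (T - R) / R = (5 - 3) / 3 ≈ 0.67, cooperating beats defecting
# against a cooperator: 3 + 3w > 5, so the greedy choice flips.
print(best_response("C", w=1.0))  # -> C
```

The same crossover shows up in learned policies, not just greedy ones; this just makes the threshold arithmetic explicit.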
This really shouldn’t be surprising… our own human empathy is fundamentally a biological sort of ‘answer’ to the same sort of ‘question.’
It is actually quite possible to base an AI more fundamentally on a simulation of empathy than on a simulation of logic.
Unfortunately, the people in charge of throwing human money at LLM AI are largely narcissistic sociopaths… so of course they chose to emulate themselves, not the basic human empathy that they lack.
Their wealth exists and is maintained only through the construction and refinement of elaborate systems for confusing, destroying, and misdirecting the broad empathy of normal humans.
CileTheSane@lemmy.ca 1 day ago
I don’t care if it’s genuine or not. Computers can definitely mimic empathy and can be programmed to do so.
When you watch a movie you’re not watching people genuinely fight/struggle/fall in love, but it mimics it well enough.
lka1988@lemmy.dbzer0.com 1 day ago
Jesus fucking christ on a bike. You people are dense.
CileTheSane@lemmy.ca 15 hours ago
Oh, I get it now. You are incapable of empathy or a basic level of decency, and you’re upset because you thought people would at least rather put up with you than with a computer. If computers can mimic a basic level of human respect, then what chance do you have?
lka1988@lemmy.dbzer0.com 14 hours ago
What the fuck is the jump to personal attacks?
This is the comment that started this chain, I made an equally tongue-in-cheek comment, and now I’m the one that gets piled on?
Fuck right off.