It could also be that it lacks the machinery to feel any emotions at all. You don’t (normally) have to train people to be afraid of bears or heights or loneliness or boredom. You also don’t (normally) have to train people to have empathy or compassion.
I argue that our obsession with AI is, itself, a misalignment with our environment; it disproportionately tickles psychological reward centers which evolved under unrecognizably different circumstances.
So what are you implying about people who don’t experience these?
What am I implying? That their machinery is abnormal and they likely need assistance to live normal, healthy lives. That’s literally why the fields of psychiatry and psychology exist: healthy people don’t need doctors and therapists. Do you disagree?
Havoc8154@mander.xyz 1 week ago
I guess you don’t have children.
You absolutely do have to train them to be afraid of bears, heights, and every fucking thing you can imagine. You absolutely do have to teach them empathy and compassion. There may be some nugget of instinct, but without reinforcement it might as well not exist.
ExFed@programming.dev 1 week ago
Hah, okay, you got me there. From my understanding, though, that’s mostly because kids are still figuring out what’s “normal”, so their fear instinct isn’t nearly as strong yet. I should’ve stuck to the more clearly instinctive sources of fear…
Regardless, that’s not really my point. My point is that an LLM doesn’t rely on the same kind of machinery a human brain does. That doesn’t make AI “worse” or “better” overall, but it does make it an awful replacement for other humans.