Comment on Expecting an LLM to become conscious is like expecting a painting to become alive
Sconrad122@lemmy.world 5 weeks ago
An alternative way to phrase it: we don't train humans to be ego-satiating brown-nosers; we train them to be (often poor) judges of character. AI would be just as nice to David Duke as it is to you. Also, "they" anthropomorphizes LLM AI far more than it deserves. It's not even a single identity, let alone a set of multiple identities. It's a bundle of hallucinations, loosely tied together by suggestions and patterns taken from stolen data.
Aeri@lemmy.world 5 weeks ago
Sometimes I feel like LLM technology and its relationship with humans is a symptom of how poorly we treat each other.