Comment on Racism restaurant
Glide@lemmy.ca 7 hours ago
Slurs target a marginalized group. “Clanker” does not target a marginalized group, because generative AI is not part of a marginalized group. It is not even alive, therefore it is not a slur in the sense you’re equating it to.
Please don’t call other people “clueless” if you don’t understand the things you’re getting worked up over. Equating non-thinking computer processing models to oppressed minorities is doing far more damage than anyone using the term “clanker,” ironically or otherwise.
carotte@lemmy.blahaj.zone 7 hours ago
yea, i guess someone saying “screws will not replace us” or “13% of the code, 50% of the bugs” (both real things i’ve seen people say) means nothing bad by it. dogwhistle? never heard of that! it’s just wholesome anti AI fun!
i hope i don’t have to explain why saying something like this is bad, right? and that it’s not about AI being a marginalized group, because of fucking course it isn’t, that’s never been the point!
like, when you see shit like “johnny the walrus”, a book written by a far right podcaster about how boys can’t become walruses, no matter how many medical interventions they have being forced on them, and that they’ll all grow out of their “wanting to become a walrus” phase, do you think it’s really about walruses? do you think when people say “this book is transphobic” what they mean is “people wanting to become walruses are a marginalized group comparable to trans people”? cause it’s the same shit here!
UltraGiGaGigantic@lemmy.ml 2 hours ago
Weird hill to die on, but it’s your life. Die where you want.
Glide@lemmy.ca 1 hour ago
I think there’s a huge difference between an intentional allegory used as an attack on a marginalized group, and a word being used to derogatorily refer to a non-living, non-feeling group of machines which are actively damaging the world.
I would agree that these things are not okay, because they’re imitating insults that literally only exist to put forward racist ideology, and I’d tell anyone who used them around me as much.
So, how about “chud” then, intended to refer to right-wing-minded hate mongers? Or, here’s a better one, how about when we call right-wing extremists nazis as a derogatory insult? I mean they’re certainly not all members of the Third Reich. We’re using the term to equate them to something they strictly aren’t, even if they share more ideology than is okay. We’re still using words to categorize groups of people, and using them in intentionally insulting ways. Such derogatory terms are a part of our natural language. They’re not nice, sure, but I don’t want to be nice to people who are actively calling for violence against marginalized groups. But most importantly, we don’t think of these words as slurs. Slurs are derogatory terms that target marginalized groups, unlike “chud,” “nazi,” or yes, “clanker.”
I think there’s far too much nuance here to make blanket insinuations like “derogatory terms used to refer to things we don’t like are stand-ins for racist remarks.” But considering some of the other connections you’ve seen people make, I can certainly understand the trepidation.
Warl0k3@lemmy.world 4 hours ago
Both examples are of disparaging comments directed at, you know, people. Veiled comments, sure, but pretty clearly directed at them nonetheless.
Hot take, but: AI aren’t people. As a result, comments directed at them aren’t directed at people. Dogwhistles work because they’re comments directed at a group of people, couched in language so as to imply they are directed at something else. Do you see the difference? Dogwhistles are still directed at people, and clankers are, you know, not people.
Nobody’s defending dogwhistling, but you’re trying to imply that all negative comments that use “clanker” are dogwhistling, and you know darn well that that’s disingenuous.