Comment on Study Finds LLMs Biased Against Men in Hiring
grober_Unfug@discuss.tchncs.de 4 weeks ago
the AI considered
Sorry to break it to you, but the "AI" does not "consider" anything. They are talking about a language prediction model.
the problematic part of this is that you’ve stripped all context to support your, admittedly bigoted, rhetoric and ethos.
black people, generally, have worse education outcomes than whites in american education. you’d still be an incredibly shitty and terrible person if you advocated hiring white people over black people by rote rule. you can find plenty of “studies” that formalize that argument just as you have here, though.
no, i think most rational people understand that in a scenario like this all people have, on average, the same basic cognitive faculties and potential, and would then proceed to advocate for improving the educational conditions for groups that are falling behind not due to their own nature, but due to the system they are in.
but idk, i’m not a bigot so maybe my brain just implicitly rejects the idea “X people are worse/less intelligent/etc than Y people”
fucking think about what you’re saying. there is no “right people” to hate other than the rich and powerful. it isn’t a subversion of the feminist message to admit this. in fact, it makes you a better feminist. real feminists aren’t sexist.
This isn’t exactly a comprehensive literature review, and it totally misunderstands what an LLM is and does
Right. If it's true that women statistically outperform men (with the same application documents), it'd be logical to prefer them based on gender alone, because they'd likely turn out to be better.
For most jobs it's hard to do a hiring process without in-person interviews, or at the very least video calls. So I'm not really sure how one could realistically get rid of biases. But I completely agree that whenever there are too many applications to interview everyone individually, the initial screening of applicants should be completely anonymized and rely only on technologies where biases can at least be understood.
For the final step I'm afraid we'll have to try to train people to be less prone to biased decision-making. Which I agree is not a very promising path.
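A minimal sketch of what that anonymization step could look like, assuming applications arrive as structured records; all field names here are hypothetical placeholders, not anything from the study:

```python
# Minimal sketch: drop identity-revealing fields from an application record
# before the initial screening step. Field names are hypothetical.

IDENTITY_FIELDS = {"name", "gender", "photo", "date_of_birth", "nationality"}

def anonymize(application: dict) -> dict:
    """Return a copy of the application without identity-revealing fields."""
    return {k: v for k, v in application.items() if k not in IDENTITY_FIELDS}

application = {
    "name": "Jane Doe",
    "gender": "female",
    "education": "MSc Computer Science",
    "years_experience": 6,
}

print(anonymize(application))
# {'education': 'MSc Computer Science', 'years_experience': 6}
```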
You're welcome. I mean it's kind of a factual question. Is gender an indicator on its own? If yes, then the rest is just how statistics and probability work... And that's not really a controversy. Maths in itself works 🥹
I'd also welcome if we were to cut down on unrelated stuff, stereotypes and biases. Just pick what you like to optimize for and then do that. At least if you believe in the free market in that way. Of course it also has an impact on society, people etc and all of that is just complex. And then women and men aren't really different, but at the same time they are. And statistics is more or less a tool. Highly depends on what you do with it and how you apply it. It's like that with most tools.
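As a toy illustration of that statistical argument (made-up numbers, not figures from the study): if a screener blends an individual's document score with a group average, two identical applications end up with different estimates purely because of group membership.

```python
# Toy illustration of the base-rate argument, with made-up numbers.
# A screener that blends a noisy document score with a group average
# gives identical documents different estimates depending on the group.

def expected_performance(doc_score: float, group_mean: float,
                         signal_weight: float = 0.7) -> float:
    """Weighted blend of the individual's document score and the group average.

    signal_weight says how informative the document alone is assumed to be
    (1.0 = ignore the group entirely, 0.0 = use only the group average).
    """
    return signal_weight * doc_score + (1 - signal_weight) * group_mean

doc_score = 75.0                         # identical application documents
group_a_mean, group_b_mean = 72.0, 68.0  # hypothetical group averages

print(expected_performance(doc_score, group_a_mean))  # 74.1
print(expected_performance(doc_score, group_b_mean))  # 72.9 (same document, lower estimate)
```

Whether that kind of blending is ever defensible is exactly what the rest of the thread is arguing about.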
catty@lemmy.world 4 weeks ago
technocrit@lemmy.dbzer0.com 4 weeks ago
Handpicked poor study… What do you think this whole post is?
catty@lemmy.world 4 weeks ago
Allow me to do what feminists do - including in this very thread:
“Women can’t take it”
cabbage@piefed.social 4 weeks ago
At least where I'm from, it's pretty well known that girls outperform boys in school, probably because their brains develop slightly faster in some ways useful for performing in a classroom.
This could give women a head start and very well lead to them on average performing better in work life, until they are forced to choose between careers and families while their partners continue to advance their careers at full speed, not worrying about being pregnant.
But that's a different discussion. We should avoid biases in hiring because biases suck and make for an unjust society. And we should stop pretending language models make intelligent considerations about anything.
What's fascinating here is that LLMs trained on the texts we produce create the opposite bias of what we observe in society, where men tend to get preferential treatment. My guess is that this is a consequence of inclusive language. In my writing, whenever women are under-represented, I make a point of defaulting to she and her rather than he and him. I know others do the same. I imagine this could feed into LLMs. Whatever it is that causes this, it sure as fuck isn't anything actually intelligent.
catty@lemmy.world 4 weeks ago
At least where I’m from, it’s pretty well known that the education system is better suited to girls than boys, probably because it needs a reform.
To paraphrase: women can get pregnant and can’t work. It’s the man’s fault. It isn’t the extreme and aggressive pandering to feminism that gives women a “head start”, but the fact that they’re better educated.
So you’re stating that LLMs are making dumb decisions by recommending women over men. Lol
cabbage@piefed.social 4 weeks ago
I didn't say it doesn't, clearly there's a problem when half the population is systematically favoured.
Where the fuck did I say that it's the man's fault? It's a societal problem, doesn't mean it's anybody's fault.
I'm the first to criticize corporate feminism (just like greenwashing and pride washing), but I suspect feminist messaging appeals to women because they are sick of the patriarchy, not because they are programmed by marketing agencies. The fuck are you on about.