Being representative of actual real-world statistics is not “bias”.
Comment on Gender Bias in Text-to-Image Generative Artificial Intelligence When Representing Cardiologists.
Deceptichum@quokk.au 1 year ago
Because the biases in an AI model will shape the perception of people entering those fields more than a poster at a workplace where people have already entered those fields.
Likewise, you can train a bias out of it: just feed it more content showing diverse workforces and it will start weighting them higher.
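The “feed it more diverse content” idea is essentially data reweighting or oversampling. A minimal sketch in plain Python (hypothetical labels, not any real training pipeline) of inverse-frequency weighting, where underrepresented groups get a larger per-sample weight:

```python
from collections import Counter

def balance_weights(labels):
    """Weight each sample inversely to its group's frequency,
    so rare groups contribute as much in total as common ones."""
    counts = Counter(labels)
    n_groups = len(counts)
    total = len(labels)
    return [total / (n_groups * counts[y]) for y in labels]

# Hypothetical training set where images labeled "woman" are rare
labels = ["man"] * 8 + ["woman"] * 2
weights = balance_weights(labels)
# Each "woman" sample now weighs more than each "man" sample,
# and each group's total weight is equal (5.0 and 5.0 here).
```

These weights would then feed into the training loss (or a sampler), which is exactly the kind of deliberate adjustment the rest of the thread argues should be acknowledged as an intentionally added bias.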
conciselyverbose@sh.itjust.works 1 year ago
Deceptichum@quokk.au 1 year ago
Statistics are not people.
There are huge biases at play in why gender differences exist in the workplace and most other places.
conciselyverbose@sh.itjust.works 1 year ago
Statistics are people.
It cannot possibly, under any circumstances, be a correct, reasonable, or valid word choice to describe an “AI” that accurately models reality as “biased”. That word already has a meaning and using it in that manner is a lie.
Randomgal@lemmy.ca 1 year ago
This. If you don’t like that the AI presents more men than women as cardiologists because it’s mirroring society, the problem is not the AI; the problem is the lack of non-cis-male representation in real-world cardiology.
TheGrandNagus@lemmy.world 1 year ago
The depiction aligning with reality is not a bias. Artificially altering the algorithm so that it shows more women for this prompt on the other hand is, unquestionably, adding a bias.
If you want to add a bias, fine. Biases aren’t always a bad thing. But don’t pretend that what you’re actually advocating for here is correcting a bias.
That is training in a bias, because it’s not representative of reality.
conciselyverbose@sh.itjust.works 1 year ago
Exactly. There are perfectly legitimate reasons to want to bias learning toward certain changes in output.
There are no legitimate reasons to do so without acknowledging that you are adding bias and being clear on the intent of doing so.
nyan@lemmy.cafe 1 year ago
And even in cases where introducing a bias is desirable, you have to be very careful when doing it. There has been at least one case where introducing a bias toward diversity caused problems when the model was asked for images of historical people, who were often not diverse at all.