Mic_Check_One_Two@reddthat.com 1 year ago
This isn’t new. It’s been a known problem for a long time, because facial recognition software is trained mostly on images of white people. So it gets really, really good at differentiating between white people. But with black people making up only a tiny fraction of the sample data, it basically just learns to differentiate them in broad strokes. It’s good at telling them apart from white people, but not much else.
phx@lemmy.ca 1 year ago
It’s not just a training issue. Lighter (color) tones reflect light; dark tones absorb it. There have been lots of issues with cameras, or even just sensors, struggling with dark skin tones because of the lower reflectivity/contrast of dark tones. 3D scanners - even current models - have similar issues with objects that have black parts, for similar reasons. Training on more diverse data can help, but there’s still an overall technical difficulty to overcome as well.
Shihali@sh.itjust.works 1 year ago
As a technological problem it could have a partial technological solution: the darker the skin, the higher the threshold to declare a match. This would also mean more false negatives (real matches not caught by the software), but there’s not much to do about that.
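The idea above could be sketched roughly like this. This is a hypothetical illustration, not any real system's logic: the matcher, the similarity score in [0, 1], and the use of the Fitzpatrick skin-tone scale (1 = lightest, 6 = darkest) are all assumptions for the sake of the example.

```python
# Hypothetical sketch: raise the match threshold for darker skin tones,
# trading false positives for false negatives, as the comment suggests.
# Assumes a face matcher that yields a similarity score in [0, 1] and
# some external estimate of skin tone on the Fitzpatrick scale (1-6).

def match_threshold(skin_tone: int,
                    base: float = 0.80,
                    step: float = 0.02) -> float:
    """Similarity score required to declare a match for a given tone.

    Tone 1 (lightest) uses the base threshold; each darker step on the
    Fitzpatrick scale raises the bar by `step`. Both numbers are
    made-up example values, not calibrated ones.
    """
    if not 1 <= skin_tone <= 6:
        raise ValueError("skin_tone must be in 1..6")
    return base + step * (skin_tone - 1)

def is_match(similarity: float, skin_tone: int) -> bool:
    """Declare a match only if similarity clears the tone's threshold."""
    return similarity >= match_threshold(skin_tone)
```

So a similarity of 0.85 would count as a match for tone 1 (threshold 0.80) but not for tone 6 (threshold 0.90), which is exactly the false-negative trade-off the comment acknowledges.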