The pedestrian detection systems in self-driving cars are less likely to detect children and people of color, study suggests: Researchers say the results stem from biases present in open-source AI, on which self-driving cars rely.
This is the best summary I could come up with:
As the artificial intelligence revolution ramps up, one trend is clear: Bias in the training of AI systems is resulting in real-world discriminatory practices.
A team of researchers in the UK and China tested how well eight popular pedestrian detectors worked depending on a person’s race, gender, and age.
“[…] Now they might face severe injury,” Jie Zhang, a computer scientist at King’s College London and a member of the research team, said in a statement.
This trend is a result of biases already present in the open-source AI systems that many companies use to build the detectors.
The research team called on lawmakers to regulate self-driving car software to prevent bias in pedestrian detection systems.
“It is essential for policymakers to enact laws and regulations that safeguard the rights of all individuals and address these concerns appropriately,” the study reads.