Study: Self-Driving Cars Have Trouble Detecting People of Color

The automated decision-making processes of self-driving cars are ill-equipped to protect the safety of people of color in public.
March 7, 2019, 6am PST | James Brasuell | @CasualBrasuell

"If you’re a person with dark skin, you may be more likely than your white friends to get hit by a self-driving car," reports Sigal Samuel.

That's the finding of a new study [pdf] by researchers at the Georgia Institute of Technology, who analyzed autonomous vehicle footage from New York City, San Francisco, Berkeley, and San Jose. Samuel explains the study in more detail:

The authors of the study started out with a simple question: How accurately do state-of-the-art object-detection models, like those used by self-driving cars, detect people from different demographic groups? To find out, they looked at a large dataset of images that contain pedestrians. They divided up the people using the Fitzpatrick scale, a system for classifying human skin tones from light to dark.

The researchers then analyzed how often the models correctly detected the presence of people in the light-skinned group versus how often they got it right with people in the dark-skinned group.
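In code terms, the comparison the researchers describe boils down to a per-group detection rate. Here is a minimal sketch of that calculation in Python, assuming pedestrian annotations that record a binarized Fitzpatrick group and whether the detector found each person; the field names and numbers are illustrative, not the study's actual code or data:

```python
# Minimal sketch of the per-group comparison, assuming each pedestrian
# annotation records a binarized Fitzpatrick group ("light" for types
# I-III, "dark" for types IV-VI) and whether the object detector found
# that person. Field names and numbers are illustrative, not the study's.
from collections import defaultdict

def detection_rates(annotations):
    """Fraction of pedestrians correctly detected, per skin-tone group."""
    hits, totals = defaultdict(int), defaultdict(int)
    for person in annotations:
        group = person["fitzpatrick_group"]  # "light" or "dark"
        totals[group] += 1
        if person["detected"]:
            hits[group] += 1
    return {group: hits[group] / totals[group] for group in totals}

# Synthetic example showing a 5-percentage-point gap like the one reported.
sample = (
    [{"fitzpatrick_group": "light", "detected": i < 90} for i in range(100)]
    + [{"fitzpatrick_group": "dark", "detected": i < 85} for i in range(100)]
)
rates = detection_rates(sample)
print(rates)  # {'light': 0.9, 'dark': 0.85}
print(f"gap: {(rates['light'] - rates['dark']) * 100:.1f} points")
```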

The results of the study are concerning, to say the least.

Detection was five percentage points less accurate, on average, for the dark-skinned group. That disparity persisted even when researchers controlled for variables like the time of day in images or the occasionally obstructed view of pedestrians.
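Controlling for a variable like time of day amounts to recomputing the same gap within each stratum and checking whether it shrinks. A hedged sketch of that stratified check, again with hypothetical field names rather than the researchers' actual pipeline:

```python
# Hedged sketch of the stratified check: recompute the light/dark gap
# within each value of a control variable (here, time of day). Field
# names are hypothetical; this is not the researchers' pipeline.
from collections import defaultdict

def gap_by_stratum(annotations, stratum_key="time_of_day"):
    """Detection-rate gap (light minus dark) inside each stratum."""
    counts = defaultdict(lambda: {"light": [0, 0], "dark": [0, 0]})
    for person in annotations:
        hits_total = counts[person[stratum_key]][person["fitzpatrick_group"]]
        hits_total[1] += 1
        if person["detected"]:
            hits_total[0] += 1
    gaps = {}
    for stratum, groups in counts.items():
        (lh, lt), (dh, dt) = groups["light"], groups["dark"]
        if lt and dt:  # skip strata missing one of the groups
            gaps[stratum] = lh / lt - dh / dt
    return gaps

# If lighting explained the disparity, these per-stratum gaps would
# collapse toward zero; per the article, the gap persisted instead.
```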

Samuel ties the problem to the well-documented record of algorithmic bias: human bias influencing the results of automated decision-making systems. The human failure behind the machine's failure also implies potential solutions to the problem.
