Some self-driving car systems have trouble detecting darker skin, study says

Last year, Microsoft, IBM, and Amazon were called out after their facial recognition technology was shown to perform worse on people with dark skin. Well, it looks like self-driving cars could have the same problem.

An analysis from Georgia Tech researchers found that systems used by self-driving cars to detect pedestrians had trouble picking out people with darker skin tones.

Using footage from the Berkeley Driving Dataset, which includes video from New York, Berkeley, San Francisco, and San Jose, the researchers studied how detection systems responded to different types of pedestrians.

They took eight image recognition systems commonly used in autonomous vehicle research and evaluated how well each detected pedestrians across skin tones, as measured on the Fitzpatrick skin type scale. They found "uniformly poorer performance of these systems when detecting pedestrians with Fitzpatrick skin types between 4 and 6," the darker end of the scale.