Last year, Microsoft, IBM, and Amazon were called out for using facial recognition technology that was biased against people with dark skin. Well, it looks like self-driving cars could have the same problem.

An analysis from Georgia Tech researchers found that systems used by self-driving cars to detect pedestrians had trouble picking out people with darker skin tones.

Using footage from the Berkeley Deep Drive dataset, which includes video shot in New York, Berkeley, San Francisco, and San Jose, the researchers were able to study how detection systems react to different types of pedestrians.

They took eight image recognition systems commonly used in autonomous vehicles and evaluated how well each detected pedestrians across skin tones, as measured on the Fitzpatrick skin type scale. They found "uniformly poorer performance of these systems when detecting pedestrians with Fitzpatrick skin types between 4 and 6," which are darker skin types.

Several factors could lead to inaccurate results, like time of day or clothing color. But the researchers found that, based on skin color alone, accuracy dropped by an average of 5 percent for pedestrians with darker skin. If a system doesn't identify a person as a pedestrian, that person is at greater risk of being hit, because the computer doesn't know to predict their behavior.
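To make the comparison concrete, here is a minimal sketch of the kind of per-group analysis the study describes: detection outcomes are bucketed by Fitzpatrick skin type, and the detection rate is compared between lighter (types 1-3) and darker (types 4-6) groups. The data below is invented for illustration and is not from the paper.

```python
from collections import defaultdict

# Hypothetical (fitzpatrick_type, detected) pairs: True means the
# pedestrian was correctly detected by the vision system.
detections = [
    (1, True), (2, True), (3, True), (2, True), (3, False),
    (4, True), (5, False), (6, True), (5, True), (6, False),
]

hits = defaultdict(int)
totals = defaultdict(int)
for skin_type, detected in detections:
    # Bucket into the two groups the study compares.
    group = "light (1-3)" if skin_type <= 3 else "dark (4-6)"
    totals[group] += 1
    hits[group] += detected

for group in ("light (1-3)", "dark (4-6)"):
    rate = hits[group] / totals[group]
    print(f"{group}: detection rate {rate:.0%}")
```

With this toy data, the lighter group is detected 80 percent of the time versus 60 percent for the darker group; the study reports a smaller but consistent gap of about 5 percent on real driving footage.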

Many autonomous cars use a mix of LiDAR, radar, other sensors, and cameras. A few companies rely heavily on cameras, as Tesla does for its semi-autonomous Autopilot system, and Silicon Valley-based Ambarella is developing a self-driving system that relies almost entirely on them.

Not all companies use cameras, though. Blackmore is focused on Doppler LiDAR, which measures the velocity of objects, so clothing choices and skin tone don't matter: the system concentrates on things that are moving rather than on stationary objects like trees and mailboxes.