Autonomous cars do not offer the same level of safety for every pedestrian. A study by King’s College London (KCL) has concluded that the detection systems these vehicles rely on find it harder to recognize children and people with darker skin.
The findings have caused a stir: after systematically reviewing eight well-known pedestrian detection systems, the researchers found that all of them showed similar, potentially deadly age and race biases.
The researchers tested the detection software on more than 8,000 images of pedestrians and found that average detection accuracy was almost 20% higher for adults than for children. The systems were also 7.5% more accurate for light-skinned pedestrians than for dark-skinned ones.
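The study's own analysis code is not reproduced here, but the kind of per-group comparison it reports can be illustrated with a short, hypothetical sketch. It assumes each labelled pedestrian record notes whether the detector found that person, along with age and skin-tone labels, and simply computes the detection rate for each group:

```python
from collections import defaultdict

# Hypothetical records: one entry per labelled pedestrian, noting whether
# the detector found them and the demographic labels used for the analysis.
records = [
    {"detected": True,  "age": "adult", "skin_tone": "light"},
    {"detected": False, "age": "child", "skin_tone": "dark"},
    {"detected": True,  "age": "adult", "skin_tone": "dark"},
    {"detected": False, "age": "child", "skin_tone": "light"},
    # ... thousands more labelled pedestrians in a real evaluation
]

def detection_rate_by_group(records, attribute):
    """Fraction of labelled pedestrians the detector actually found, per group."""
    hits = defaultdict(int)
    totals = defaultdict(int)
    for r in records:
        group = r[attribute]
        totals[group] += 1
        hits[group] += int(r["detected"])
    return {group: hits[group] / totals[group] for group in totals}

for attribute in ("age", "skin_tone"):
    rates = detection_rate_by_group(records, attribute)
    # A large gap between groups (e.g. adults vs. children) is the kind of
    # disparity the KCL study reports.
    print(attribute, {group: f"{rate:.1%}" for group, rate in rates.items()})
```

The field names and records above are illustrative only; the point is that comparing detection rates across demographic groups, rather than reporting a single overall accuracy, is what makes such gaps visible.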
“AI systems must be trained with a lot of data, and the inadequacies of that data are inevitably reflected in the AI. In this case, the open-source image galleries used to train these pedestrian detection systems are not representative of all pedestrians, and are biased towards lighter-skinned adults,” explains Jie Zhang.
For this reason, Zhang believes the way these systems are trained must be reviewed: “Developers should start by being more transparent in terms of how their detection systems are trained so that they can be measured objectively.”
Furthermore, Zhang adds, “they have to make sure that their AI systems are fair and representative, and part of the impetus for that will come from policy makers and stronger regulation around fairness in AI.”