Tuesday, 19 March 2019 18:48

Autonomous Cars Can't Recognize Pedestrians With Darker Skin Tones

Written by Jessica Miley, Interesting Engineering
Image: Wikimedia Commons


Racial Profiling Is Lethal

 

The system is used by judges in criminal sentencing; it assigns each defendant a score indicating how likely that person is to reoffend. A high score suggests they will reoffend; a low score suggests it is less likely.

 

The investigative journalists assessed the risk scores assigned to more than 7,000 people in Broward County, Florida, in 2013 and 2014, and then checked whether the same people were charged with any new crimes over the next two years.

 

The algorithm proved not only unreliable (only 20 percent of the people predicted to commit violent crimes actually did so) but also racially biased.

 

Black defendants were wrongly flagged as future criminals at almost twice the rate of white defendants, while white defendants were mislabeled as low risk more often than black defendants.

 

The AI development community must come together and take a public stand against this sort of massively damaging bias.

 

We thank Interesting Engineering for reprint permission.


