Facial Recognition Technology
This is the first in a four-part series of panel discussions on how technology impacts social justice. This virtual, hour-long discussion will explore facial recognition technology and law enforcement's use of the flawed data it provides.
More detail about the topic, including links to articles and a video, is below. We will discuss the situation as it exists today and what can be done to remove bias from these systems. Please register for this free online event and join us on September 11 for this interesting and timely discussion.
More Information and Links
The systems falsely identified African-American and Asian faces 10 to 100 times more often than Caucasian faces, according to the National Institute of Standards and Technology (NIST). Among a database of photos used by law enforcement agencies in the United States, the highest error rates came in identifying Native Americans, the study found.
While the Chicago Police Department has stopped using facial recognition software, many states have not followed suit. This is especially concerning for people of color. Three commercially released facial-analysis programs from major technology companies demonstrate both skin-type and gender biases, according to researchers from MIT and Stanford University. Despite claims of an accuracy rate of more than 97 percent, the three programs' error rates in determining the gender of light-skinned men were never worse than 0.8 percent. For darker-skinned women, however, the error rates ballooned to more than 20 percent in one case and more than 34 percent in the other two. The discrepancy is explained by examining the data set used to assess the software's performance: it was more than 77 percent male and more than 83 percent white.
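The disparities described above are found by breaking a system's error rate down by demographic group rather than reporting a single overall accuracy number. The sketch below shows, in minimal Python, how such a per-group audit can be computed; the sample data and group labels are invented for illustration and do not reproduce the NIST or MIT figures.

```python
# Minimal sketch of a per-group error-rate audit.
# All data below is hypothetical, for illustration only.

from collections import defaultdict

def error_rates_by_group(records):
    """records: iterable of (group, predicted_label, true_label) tuples.

    Returns a dict mapping each group to its misclassification rate.
    """
    errors = defaultdict(int)
    totals = defaultdict(int)
    for group, predicted, actual in records:
        totals[group] += 1
        if predicted != actual:
            errors[group] += 1
    return {group: errors[group] / totals[group] for group in totals}

# Hypothetical audit records: (group, predicted gender, true gender).
sample = [
    ("lighter-skinned male", "male", "male"),
    ("lighter-skinned male", "male", "male"),
    ("darker-skinned female", "male", "female"),    # misclassification
    ("darker-skinned female", "female", "female"),
]
print(error_rates_by_group(sample))
```

A single headline accuracy figure computed over a data set that is mostly male and mostly white will mask exactly the kind of gap this breakdown exposes, which is why audits of this form report each group's rate separately.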