Technology and Social Justice


                                               Facial Recognition Technology 

                 A flawed technology whose outputs are used by police with devastating results

This is the first in a four-part series of panel discussions on how technology impacts social justice. This hour-long virtual discussion will explore facial recognition technology and law enforcement's use of the flawed data it provides.


                                              Deborah Raji, AI Now, and Rachel Murphy, ACLU

More details about the topic, including links to articles and a video, are below. We will discuss the situation as it exists today and what can be done to remove bias from these systems. Please register for this free online event and join us on September 11 for this timely discussion.

                                                        More Information and Links


Facial recognition technology has come under fire for racial bias due to inaccurate facial matches, particularly for the faces of people of color. Facial recognition systems tend to reproduce the biases embedded in the data used to train and evaluate them.
These systems falsely identified African-American and Asian faces 10 to 100 times more often than Caucasian faces, according to a study by the National Institute of Standards and Technology. Among a database of photos used by law enforcement agencies in the United States, the highest error rates came in identifying Native Americans, the study found.

While the Chicago Police Department dropped its use of facial recognition software, many states have not followed, which is especially concerning for people of color. Three commercially released facial-analysis programs from major technology companies demonstrate both skin-type and gender biases, according to researchers from MIT and Stanford University. Despite claims of an accuracy rate of more than 97 percent, the three programs' error rates in determining the gender of light-skinned men were never worse than 0.8 percent. For darker-skinned women, however, the error rates ballooned to more than 20 percent in one case and more than 34 percent in the other two. The discrepancy is explained by examining the data set used to assess performance: it was more than 77 percent male and more than 83 percent white.



Hosted by

Real-Time Communications Laboratory

The IIT RTC Laboratory is a research and teaching lab where industry and academia collaborate.


Real Time Communications Lab

The Real-Time Communications (RTC) Lab at Illinois Tech - Where Industry and Academia Connect!

IIT Office of Community Affairs

Building relationships between Illinois Tech and local community-based organizations, businesses, and Chicago Public Schools in Bronzeville and the surrounding Chicago communities.

Bronzeville Historical Society

Preserving and protecting the African American history and culture of Chicago.

University of Illinois, Office of Community Relations

The Office of Community Relations works to build and maintain relationships between UIC and its neighboring communities.