
Racial bias in facial recognition algorithms

One of the most pressing threats to human rights and racial justice is the proliferation of racist facial recognition technology. March 21st is the International Day Against Racial Discrimination. This is a day to commit to understanding how systemic racism operates and take action toward a better future for all of us.

In Amnesty International Canada’s new podcast series, Rights Back At You, we focus on anti-Black racism. We examine how policing, surveillance and technology collide to perpetuate racial discrimination.

Global demonstrations against police violence in 2020 renewed questions about the impacts of surveillance technology on protesters, particularly Black protesters. In New York, one common protest route had near-total police CCTV coverage, according to Amnesty International’s Ban the Scan campaign.

Facial recognition technology

Facial recognition is biometric software that uses photos to identify or verify a face.

It maps your facial features, measuring things like the shape of your nose or the distance between your eyes, and then compares the results to another image for verification, much as you might unlock your phone. Or it compares your face to many images in a database, such as matching a photo of a person from a protest against a database of driver’s licence photos connected to addresses.
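The two uses described above can be sketched in code. This is a toy illustration only, not how any real deployed system works: the embeddings (lists of numbers standing in for facial measurements), the names, and the threshold are all invented for the example, and a real system would produce embeddings from photos using a trained model.

```python
import math

def distance(a, b):
    """Euclidean distance between two face embeddings."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def verify(probe, reference, threshold=0.5):
    """1:1 check, like unlocking a phone: is this the same face?"""
    return distance(probe, reference) < threshold

def identify(probe, database):
    """1:N search, like matching a protest photo against a licence
    database: return the label of the closest stored face."""
    return min(database, key=lambda name: distance(probe, database[name]))

# Made-up embeddings for illustration only.
phone_owner = [0.1, 0.9, 0.3]
camera_shot = [0.12, 0.88, 0.31]
db = {"person_a": [0.9, 0.1, 0.5], "person_b": [0.1, 0.9, 0.3]}

print(verify(camera_shot, phone_owner))  # small distance -> True (same face)
print(identify(camera_shot, db))         # prints the nearest match: person_b
```

The sketch also shows why error rates matter: in the 1:N case, the system always returns *some* nearest match, so a protester's photo can be "matched" to an innocent person whose embedding merely happens to be closest.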

Is facial recognition racist?

The results of facial recognition algorithms are notoriously inaccurate and racist.

A 2019 study by the US National Institute of Standards and Technology (NIST), a federal government agency, showed that African American and Asian faces were up to 100 times more likely to be misidentified than white faces, and that the highest false-positive rate was among Native Americans.

Research from the University of Essex in the UK found that the live facial recognition system it reviewed was accurate in just 19% of cases.

And a groundbreaking study by a trio of Black women (Joy Buolamwini, Timnit Gebru, and Deborah Raji) showed the facial recognition technology they tested performed worst when recognizing Black faces, especially Black women’s faces.

In Canada, law enforcement agencies have violated privacy rights by using facial recognition software and by monitoring the public on social media. Police departments across the country are also rolling out body cameras (portable surveillance video devices) to gather video footage of police interactions, and some body cameras have the potential to support facial recognition technology.

These surveillance activities raise major human rights concerns when there is evidence that Black people are already disproportionately criminalized and targeted by the police. Facial recognition technology affects Black protesters in many ways.

Discrimination in facial recognition technology

The use of facial recognition technology in policing can perpetuate and exacerbate existing racial biases and discrimination. Black protesters are often subjected to greater scrutiny and suspicion, leading to heightened levels of harassment and arrest. Carding (or “street checks”) is a persistent racial discrimination problem in Canada where Black people, Indigenous people, and people of colour are more frequently arbitrarily stopped and questioned by the police than white people.

Misidentification in facial recognition technology

Facial recognition systems misidentify Black faces at a high rate. The technology is less accurate at identifying people with darker skin tones, especially women. This can result in the misidentification of Black protesters or false-positive matches in image databases. In some cases, police have wrongfully arrested people based on these errors.

Surveillance and risk to freedom of expression

Law enforcement agencies may use facial recognition technology to monitor and track protesters, which can have a chilling effect on exercising freedom of expression and freedom of assembly rights. Black protesters are more likely to be targeted for surveillance due to racial profiling and systemic racism, especially in the context of the Black Lives Matter movement against police violence and community calls to defund the police.

Facial recognition technology in the context of protests and policing raises serious concerns about human rights and racial discrimination.

Technology-facilitated discrimination

The inaccuracy of facial recognition technology means that Black people are at increased risk of discrimination and human rights impacts, especially when it is paired with systemic racism in policing. Research suggests that facial recognition technology performs poorly on people with darker skin because the images used to train the systems are predominantly of white faces.

You might think the easy solution is to include more diversity in the training data to make the systems work better. But there’s a bigger question here: do we want it to work well for policing Black people? For policing anyone? And fundamentally, do we want it to exist at all?

In Detroit, in 2020, the police arrested a Black man named Robert Williams based on a false facial recognition match. He hadn’t done anything wrong.

Introducing biased technology into contexts where racial discrimination already occurs will only exacerbate the problem.

Black people in Canada experience disproportionate levels of police violence and incarceration. Indigenous people in Canada are also disproportionately street checked, harmed, and incarcerated.

Groups that are already over-policed and criminalized will experience negative impacts from facial recognition technology.

Overall, more surveillance does not mean more safety

We are often told that more cameras and more surveillance make us safer. This is not necessarily true. Surveillance and facial recognition technology threaten our rights to privacy, non-discrimination, freedom of assembly, freedom of association, and freedom of expression.

This constitutes a threat to democracy and the ability to freely participate in social movements. This, of course, disproportionately affects people from groups that experience high levels of policing and discrimination.

Derrick Ingram, founder of the Warriors in the Garden. Photo by Ira L. Black

Want to learn more?

Listen to Episode 1 of Rights Back At You on facial recognition and anti-Black racism. It tells the story of Derrick Ingram, a Black Lives Matter protester in New York.

The New York Police Department surrounded Derrick’s apartment after he posted a photo of himself from a protest. It seemed like the police had tracked him down using facial recognition technology.

If police can use surveillance technology to target activists like Derrick, will people think twice about speaking out? We chat with experts about the threats to our right to protest, what this looks like in Canada, and how Black and Indigenous communities are the worst impacted. Facial recognition use is racist when it works, and also when it doesn’t.

Amnesty’s Podcast – Rights Back At You

Amnesty International Canada’s new podcast deeply explores how policing, racism and surveillance converge.

In this series, we explore the impacts of surveillance on Black communities and protesters. We pass the mic to people working against white supremacy to build a better future for everyone. Together, we unravel the Canada you think you know and challenge the systems that hold back human rights.


Black Mental Health and Surveillance

Surveillance and facial recognition technology present a further threat to mental health. Being constantly watched and criminalized has a significant emotional and psychological impact. As you will hear from Derrick’s story in the podcast, the effects of police violence and surveillance are lingering and traumatic.

Amnesty International Canada held an event Mental Health and Us on February 1st, 2023 to discuss systemic racism and mental health in Black communities.

Want to take action?

Facial recognition is racist. Globally, Amnesty International is running campaigns to ban facial recognition and protect the right to protest.

Ban the Scan: Sign the petition and call for a ban on facial recognition technologies now. 

Join Amnesty International’s call to end technologies of mass surveillance, including facial recognition. It is intrusive, oppressive, and racist. Governments around the world must acknowledge these harms and act to protect people’s rights. We must call on our governments to prevent the development, sale, and use of facial recognition technologies.

Protect the Protest: Add your voice to protect our rights to protest around the world.

By coming together and ensuring that everyone – especially the marginalized and discriminated against – can participate in protests, we can create a more just and equal world.

Take the Torture Out of Protest: Join us in demanding safety for protesters.

Progress happens when we come together to demand change. We should be able to do this without fear of being harmed, hurt, or even killed by the misuse of policing equipment.