Surveillance city: NYPD can use more than 15,000 cameras to track people using facial recognition in Manhattan, Bronx and Brooklyn

  • The NYPD can feed images from 15,280 surveillance cameras in Manhattan, Brooklyn, and the Bronx into facial recognition software.
  • The NYPD has used facial recognition in 22,000 cases since 2017 – but refuses to disclose details. One of these cases involved the harassment and attempted arrest of Derrick ‘Dwreck’ Ingram.
  • Most surveilled neighbourhood revealed to be East New York, Brooklyn.
  • More than 5,500 volunteers from 144 countries took part in the project, contributing the equivalent of nearly 11 years’ worth of research.
  • Ahead of the New York City mayoral primaries, Amnesty International calls on authorities to ban facial recognition for identification purposes.

The New York City Police Department (NYPD) has the ability to track people in Manhattan, Brooklyn and the Bronx by running images from 15,280 surveillance cameras through invasive and discriminatory facial recognition software, a new Amnesty International investigation reveals today.

Thousands of volunteers from around the world participated in the investigation, tagging 15,280 surveillance cameras at intersections across Manhattan (3,590), Brooklyn (8,220) and the Bronx (3,470). Combined, the three boroughs account for almost half (47%) of the intersections in New York City, creating a vast area of pervasive surveillance.

“This sprawling network of cameras can be used by police for invasive facial recognition and risks turning New York into an Orwellian surveillance city,” says Matt Mahmoudi, Artificial Intelligence & Human Rights Researcher at Amnesty International.

“You are never anonymous. Whether you’re attending a protest, walking to a particular neighbourhood, or even just grocery shopping – your face can be tracked by facial recognition technology using imagery from thousands of camera points across New York.”

East New York in Brooklyn, an area that is 54.4% Black, 30% Hispanic and 8.4% White according to the latest census data, was found to be the most surveilled neighbourhood in all three boroughs, with an alarming 577 cameras found at intersections.

Facial recognition technology threatens the right to protest and risks supercharging racist policing

The NYPD has used facial recognition technology (FRT) in 22,000 cases since 2017 — half of which were in 2019 alone. When camera imagery is run through FRT, the NYPD is able to track every New Yorker’s face as they move through the city.

FRT works by comparing camera imagery against databases containing millions of faces, many scraped from sources including social media without users’ knowledge or consent. The technology is widely recognized as amplifying racially discriminatory policing and can threaten the rights to freedom of peaceful assembly and privacy.

In the summer of 2020, FRT was likely used to identify and track Derrick ‘Dwreck’ Ingram, a participant in a Black Lives Matter protest who allegedly shouted into a police officer’s ear. Police officers were unable to produce a search warrant when they arrived at his apartment.

Amnesty International and coalition partners of the Ban the Scan campaign submitted numerous Freedom of Information Law (FOIL) requests to the NYPD, seeking more information about the extent of facial recognition usage in light of Dwreck’s case. The requests were dismissed, as was a subsequent appeal.

“There has been a glaring lack of information around the NYPD’s use of facial recognition software – making it impossible for New Yorkers to know if and when their face is being tracked across the city,” says Matt Mahmoudi.

“The NYPD’s issues with systemic racism and discrimination are well-documented – so, too, is the technology’s bias against women and people of colour. Using FRT with images from thousands of cameras across the city risks amplifying racist policing and the harassment of protesters, and could even lead to wrongful arrests.”

“Facial recognition can be and is being used by states to intentionally target certain individuals or groups of people based on characteristics including ethnicity, race and gender, without individualized reasonable suspicion of criminal wrongdoing.”

Amnesty International research has modelled the extensive field of vision of New York’s CCTV network. For example, the intersection of Grand Street and Eldridge Street sits near the border of Chinatown and was close to a key location in the Black Lives Matter protests. Our investigation found three NYPD-owned Argus cameras around the site, in addition to four other public cameras and more than 170 private surveillance cameras; our modelling suggests these cameras have the capacity to track faces from as far as 200 metres (roughly two blocks) away.

Digital army of volunteers exposes true scale of surveillance

More than 5,500 volunteers have participated in the investigation, launched on 4 May 2021 on the innovative Amnesty Decoders platform. The project is ongoing, with data still being collected for the remaining two New York City boroughs, but volunteers have already analysed 38,831 locations across the city.

It was a global effort – volunteers from 144 countries took part, with the largest group (26%) based in the United States. In just three weeks, volunteers contributed an eye-watering 18,841 hours – more than 10 years of full-time work for a researcher in the USA. Participants were shown Google Street View images of locations around New York City and asked to tag cameras, with each intersection analysed by three volunteers. The totals include both public and private cameras, both of which can feed imagery into FRT.

“The case is clear – people across the world are deeply concerned about the risk that facial recognition poses to our societies,” says Matt Mahmoudi.

“That is why Amnesty International, and coalition members from over a dozen NYC-based organisations, are calling on authorities to outlaw the use of facial recognition technology by all government agencies in New York City.”

“State Senators, City Councillors and prospective mayoral candidates have one choice – ban the scan, or risk turning New York into a surveillance city right out of the authoritarian playbook.”