Cameras Allow NYPD to Track People Throughout City: Report (Alexandra Schuler/AP)
By Charlie McCarthy | Wednesday, 09 June 2021 08:16 AM
Surveillance cameras throughout New York City allow police to track citizens via invasive facial recognition software, Amnesty International reported.
The international organization's investigation found that a combined 15,280 surveillance cameras in Manhattan, Brooklyn, and the Bronx enable the New York City Police Department to track people.
"This sprawling network of cameras can be used by police for invasive facial recognition and risk turning New York into an Orwellian surveillance city," said Matt Mahmoudi, Artificial Intelligence & Human Rights Researcher at Amnesty International.
"You are never anonymous. Whether you’re attending a protest, walking to a particular neighborhood, or even just grocery shopping — your face can be tracked by facial recognition technology using imagery from thousands of camera points across New York."
Volunteers tagged 8,220 cameras at intersections in Brooklyn, 3,590 in Manhattan, and 3,470 in the Bronx — covering almost half (47%) of the intersections in New York City.
The ongoing project is collecting data on cameras in the remaining two boroughs, Queens and Staten Island, though Amnesty International said 38,831 locations across the city had been tagged.
Camera imagery run through facial recognition technology (FRT) allows the NYPD to track every individual who has been captured visually as they move throughout the city.
Amnesty International used 3D modeling to estimate the distance at which an NYPD Argus camera could capture video footage processable by facial recognition software. The results showed that cameras could potentially capture faces in high definition from up to 650 feet (about two blocks) away.
Critics of FRT claim it's an invasion of privacy and point to the Chinese government's penchant for using FRT.
"China uses facial recognition to profile Uyghur individuals, classify them on the basis of their ethnicity, and single them out for tracking, mistreatment, and detention," 17 senators from both sides of the aisle charged in a letter to then-Secretary of State Mike Pompeo in March 2020.
"And these technologies are deployed in service of a dystopian vision for technology governance, that harnesses the economic benefits of the internet in the absence of political freedom and sees technology companies as instruments of state power."
FRT compares camera imagery with millions of faces stored in its databases, many scraped from sources, including social media, without users’ knowledge or consent. According to Amnesty International, the technology has amplified racially discriminatory policing, and can threaten the rights to freedom of peaceful assembly and privacy.
The NYPD last summer searched the apartment of Black Lives Matter activist Derrick Ingram, who allegedly had yelled in an officer's ear through a megaphone during a demonstration. The Reason Foundation said Ingram was identified through FRT that compared his image at the protest against photos on his Instagram account.
The report said the NYPD has used FRT in 22,000 cases since 2017.
East New York in Brooklyn — which is 54.4% Black, 30% Hispanic, and 8.4% White according to the latest census data — was the most heavily surveilled neighborhood in the three boroughs, with 577 cameras, according to Amnesty International.
The investigation began in early May and involved more than 5,500 volunteers from around the world as part of the innovative Amnesty Decoders platform.
Cameras attached to buildings were considered privately owned, while cameras, such as dome cameras, mounted on street lights, traffic signals, or roadside poles were most likely owned by a government agency with the permission and access to install them.
Clearview AI, a major provider of FRT, has accumulated billions of images while, as of early 2020, the FBI's own database contained 640 million, the Reason Foundation said Tuesday.
BuzzFeed reported in April that 1,803 publicly funded agencies have used or are using Clearview's FRT tools.
Clearview AI faces a lawsuit in California over its scraping practices and left Canada after being investigated by that country's privacy commissioner.