N.Y. City’s Facial Recognition Surveillance System Targets Minorities: Forbes

Racial minorities are “more at risk” of being spied on by New York City’s massive facial recognition surveillance machine.

QUICK FACTS:
  • New York City’s network of surveillance cameras captures images of citizens’ faces, which are then run through facial recognition software.
  • The chances your face will be run through the system are even higher if you’re in a predominantly Black, Asian, or Hispanic community, according to Forbes.
  • An Amnesty International study had 7,000 volunteers from over 150 countries look through Google Maps images of 43,400 intersections across New York City, marking each camera they could see.
  • The group uncovered 25,000 cameras in total.
  • Brooklyn was the most heavily surveilled of the boroughs, with 9,230 cameras, followed by Queens with 7,580.
  • The research is an expansion of Amnesty’s CCTV tracking project, which previously found 15,000 cameras across New York City.
  • Amnesty found that for areas in the Bronx, Brooklyn, and Queens, “the research also showed that the higher the proportion of non-white residents, the higher the concentration of facial recognition compatible CCTV cameras” (see the illustrative sketch after this list).
  • Amnesty put together a map showing how many cameras would capture a person’s image as they walk a given route, potentially placing them within the gaze of private or government facial recognition surveillance.
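
To make the correlation claim concrete, here is a minimal, illustrative sketch of the kind of relationship Amnesty describes. The per-neighborhood figures and the pearson helper below are made up for illustration and are not Amnesty’s data or methodology; the sketch simply shows how the share of non-white residents and the local camera count could be compared.

```python
# Illustrative only: hypothetical per-neighborhood figures, NOT Amnesty's dataset.
# Each tuple is (share of non-white residents, CCTV cameras counted in that area).
neighborhoods = [
    (0.25, 40),
    (0.40, 65),
    (0.55, 90),
    (0.70, 130),
    (0.85, 160),
]

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

shares = [share for share, _ in neighborhoods]
cameras = [count for _, count in neighborhoods]
# A value near +1 means camera density rises with the non-white population share.
print(f"Pearson r = {pearson(shares, cameras):.2f}")
```
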
WHAT AMNESTY ANALYSTS ARE SAYING:
  • Matt Mahmoudi, researcher and adviser at Amnesty International on artificial intelligence and human rights, said he hopes the research will compel citizens to call for legal action banning the technology rather than encourage people to try to hide from facial recognition, Forbes reports.
  • “Our hope with this … is to get people to write to their council persons and pressure them to introduce the bill that a coalition of civil society organizations and ourselves are trying to get introduced into the council in the next few months. And it’s also a call for them to hold the NYPD accountable,” Mahmoudi said.
HARVARD CIVIL RIGHTS–CIVIL LIBERTIES LAW REVIEW ON FACIAL RECOGNITION:
  • “Technology frequently progresses faster than legal institutions are able to keep up. Facial surveillance – the use by police and other entities of technology which can recognize people and identify them by their faces – is one such area,” reads an article from the Harvard Civil Rights–Civil Liberties Law Review, published by Harvard Law School. “Facial recognition technologies, generally speaking, are not too far from what is used to unlock the newer versions of the iPhone. In the context of government and law enforcement use, facial recognition or surveillance refers to practices like capturing information about individuals’ whereabouts and activities without their consent, and often without their knowledge. Police departments and other law enforcement bodies claim that use of these emerging technologies will help the departments engage more effectively with communities and do their work better. However, despite all the comforting rhetoric, the use of facial recognition technology for law enforcement purposes represents significant infringements on peoples’ privacy and nonconsensual surveillance. This ability for the government to have unprecedented power to track people in their daily business and without cause is ‘incompatible with a healthy democracy.’ Much like other forms of technology and tools used by law enforcement, if unchecked, these tools will harm, rather than help, communities. In particular, they will reinforce the crisis of racial bias in the criminal legal system.”
  • “The use of facial surveillance technologies means our privacy rights are at risk. Without regulation, police departments and other law enforcement bodies will be able to use systems with questionable-at-best accuracy to surveil communities without notice and perhaps without cause. Even if these technologies operated effectively, they should still give us pause. Concerns about facial recognition and surveillance programs are not mere technophobia – they represent well-founded worries about misuse of government data and the risks of such data being breached. As these technologies stand now, the high degrees of error are likely to disproportionately harm communities of color, which are already more likely to be subject to injustices of the criminal legal system. Given these harmful effects which threaten our civil liberties, these technologies can no longer go unchecked.”
BACKGROUND:
  • The study from Amnesty comes after the Surveillance Technology Oversight Project (STOP), in August of last year, called out the NYPD’s purchase of more than $277 million in surveillance equipment under contracts that had previously been hidden from the public, Forbes notes.
  • The contracts were “hidden” in the Special Expenses program, “a controversial secrecy agreement that was terminated,” STOP said.
  • The NYPD has historically been one of the more active users of facial recognition; in 2020, for example, records surfaced by STOP showed the department had carried out 22,069 facial recognition searches over the previous three years.
