Documents Show IBM Pitched The NYPD Facial Recognition Software With Built-In Racial Profiling Options
from the facial-profiling-amirite dept
Documents obtained by The Intercept show the NYPD and IBM engaged in a long-running facial recognition tech partnership from 2008 to 2016. While some of this deployment was discussed publicly, details about the extent of the program -- as well as its more problematic elements -- haven't been.
As the article's title informs the reader, camera footage could be scanned for face matches using skin tone as a search constraint. Considering this was pushed by IBM as a tool to prevent the next 9/11, it's easy to see why the NYPD -- given its history of surveilling Muslim New Yorkers -- might be willing to utilize a tool like this to pare down lists of suspects to just the people it suspected all along (Muslims).
There are a number of surprises contained in the long, detailed article, but the first thing that jumps out is IBM's efforts and statements, rather than the NYPD's. We all know the government capitalizes on tragedies to expand its power, but here we see a private corporation appealing to this base nature to make a sale.
In New York, the terrorist threat “was an easy selling point,” recalled Jonathan Connell, an IBM researcher who worked on the initial NYPD video analytics installation. “You say, ‘Look what the terrorists did before, they could come back, so you give us some money and we’ll put a camera there.’”
From this pitch sprang an 8-year program -- deployed in secrecy by the NYPD -- that gathered as much footage of New Yorkers as possible for dual purposes: the department's own law enforcement needs and a testing ground for IBM's new facial recognition tech. Needless to say, New Yorkers were never made aware of their lab rat status in IBM's software development process.
Even though the software could search by skin tone (as well as by "head color," age, gender, and facial hair), the NYPD claims it never used that feature in a live environment, despite IBM's urging.
According to the NYPD, counterterrorism personnel accessed IBM’s bodily search feature capabilities only for evaluation purposes, and they were accessible only to a handful of counterterrorism personnel. “While tools that featured either racial or skin tone search capabilities were offered to the NYPD, they were explicitly declined by the NYPD,” Donald, the NYPD spokesperson, said. “Where such tools came with a test version of the product, the testers were instructed only to test other features (clothing, eyeglasses, etc.), but not to test or use the skin tone feature. That is not because there would have been anything illegal or even improper about testing or using these tools to search in the area of a crime for an image of a suspect that matched a description given by a victim or a witness. It was specifically to avoid even the suggestion or appearance of any kind of technological racial profiling.”
It's easy to disbelieve this statement by the NYPD, given its long history of racial profiling, but it may be that those handling the secret deployment understood no program remains secret forever and sought to head off complaints and lawsuits by discouraging use of a controversial search feature. It may also be that the NYPD was super-sensitive to these concerns following the partial dismantling of its stop-and-frisk program and the outing of its full-fledged, unconstitutional surveillance of local Muslims.
The thing is, IBM is still selling the tech it beta tested live on New Yorkers. The same features the NYPD rejected are now used to sell other law enforcement agencies on the power of its biometric profiling software.
In 2017, IBM released Intelligent Video Analytics 2.0, a product with a body camera surveillance capability that allows users to detect people captured on camera by “ethnicity” tags, such as “Asian,” “Black,” and “White.”
And there's a counter-narrative that seems to dispute the NYPD's assertions about controversial image tagging features. The IBM researcher who helped develop the skin tone recognition feature is on record stating the company doesn't develop features unless there's a market for them. In his estimation, the NYPD approached IBM to ask for this feature while the 8-year pilot program was still underway. The NYPD may have opted out after the feature went live, but it may have done so only to steer clear of future controversy. An ulterior motive doesn't make it the wrong move, but it also shouldn't be assumed the NYPD has morphed into a heroic defender of civil liberties and personal privacy.
What's available to other law enforcement agencies not similarly concerned about future PR black eyes is "mass racial profiling" at their fingertips. IBM has built a product that appeals to law enforcement's innate desire to automate police work, replacing officers on the street with cameras and software. Sure, there will be some cameras on patrol officers as well, but those are just for show. The real work of policing is done at desks using third-party software that explicitly allows -- if not encourages -- officers to narrow down suspect lists based on race. In a country so overly concerned about terrorism, this is going to lead to a lot of people being approached by law enforcement simply because of their ethnicity.
An additional problem with IBM's software -- and with those produced by competitors -- is that many of the markers used to identify potential suspects can easily net a long list of probables who share nothing but similar body sizes or clothing preferences. Presumably, investigators manning these systems do more work before cops start rounding people up, but the potential for inadvertent misuse (never mind actual misuse) is still incredibly high.
The secrecy of these programs is also an issue. Restrictive NDAs go hand-in-hand with private sector partnerships and these are often translated by police officials to mean information must be withheld from judges, criminal defendants, and department oversight. When that happens, due process violations gather atop the privacy violation wreckage until the whole thing collapses under its own audacity. Nothing stays secret forever, but entities like the NYPD and IBM could do themselves a bunch of favors by engaging in a little proactive transparency.
Filed Under: facial recognition, new york, nypd, racial profiling
Companies: ibm