Documents Show IBM Pitched The NYPD Facial Recognition Software With Built-In Racial Profiling Options

from the facial-profiling-amirite dept

Documents obtained by The Intercept show the NYPD and IBM engaged in a long-running facial recognition tech partnership from 2008 to 2016. While some of this deployment was discussed publicly, details about the extent of the program -- as well as its more problematic elements -- haven't been.

As the article's title informs the reader, camera footage could be scanned for face matches using skin tone as a search constraint. Considering IBM pushed this as a tool to prevent the next 9/11, it's easy to see why the NYPD -- given its history of surveilling Muslim New Yorkers -- might be willing to use a tool like this to pare down lists of suspects to just the people it suspected all along (Muslims).

There are a number of surprises in the long, detailed article, but the first thing that jumps out is IBM's efforts and statements, rather than the NYPD's. We all know the government capitalizes on tragedies to expand its power, but here we see a private corporation appealing to that base nature to make a sale.

In New York, the terrorist threat “was an easy selling point,” recalled Jonathan Connell, an IBM researcher who worked on the initial NYPD video analytics installation. “You say, ‘Look what the terrorists did before, they could come back, so you give us some money and we’ll put a camera there.’”

From this pitch sprang an 8-year program -- deployed in secret by the NYPD to gather as much footage of New Yorkers as possible, both for its own law enforcement needs and to serve as a testing ground for IBM's new facial recognition tech. Needless to say, New Yorkers were never made aware of their lab rat status in IBM's software development process.

Even though the software could search by skin tone (as well as by "head color," age, gender, and facial hair), the NYPD claims it never used that feature in a live environment, despite IBM's urging.
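For a sense of the mechanics, here's a purely illustrative sketch -- the documents don't spell out IBM's actual interface, so every field name and tag value below is hypothetical -- of what searching footage by physical attributes boils down to: filtering classifier-tagged detections.

    # Illustrative sketch only: field names and tag values are hypothetical,
    # not IBM's actual API. Each Detection is one person spotted on camera,
    # annotated by classifiers with physical attributes.
    from dataclasses import dataclass

    @dataclass
    class Detection:
        camera_id: str
        timestamp: str
        skin_tone: str      # classifier label, e.g. "light", "medium", "dark"
        head_color: str
        age_range: str
        gender: str
        facial_hair: bool

    def search(detections, **criteria):
        """Return only the detections matching every supplied attribute."""
        return [d for d in detections
                if all(getattr(d, key) == value for key, value in criteria.items())]

    # search(footage, skin_tone="dark", facial_hair=True) is exactly the kind
    # of constraint the NYPD says it declined to use outside of testing.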

According to the NYPD, counterterrorism personnel accessed IBM’s bodily search feature capabilities only for evaluation purposes, and they were accessible only to a handful of counterterrorism personnel. “While tools that featured either racial or skin tone search capabilities were offered to the NYPD, they were explicitly declined by the NYPD,” Donald, the NYPD spokesperson, said. “Where such tools came with a test version of the product, the testers were instructed only to test other features (clothing, eyeglasses, etc.), but not to test or use the skin tone feature. That is not because there would have been anything illegal or even improper about testing or using these tools to search in the area of a crime for an image of a suspect that matched a description given by a victim or a witness. It was specifically to avoid even the suggestion or appearance of any kind of technological racial profiling.”

It's easy to disbelieve this statement, given the NYPD's long history of racial profiling, but it may be that those handling the secret deployment understood no program remains secret forever and sought to head off complaints and lawsuits by discouraging use of a controversial search feature. It may also be that the NYPD was super-sensitive to these concerns following the partial dismantling of its stop-and-frisk program and the outing of its full-fledged, unconstitutional surveillance of local Muslims.

The thing is, IBM is still selling the tech it beta tested live on New Yorkers. The same features the NYPD rejected are now used to sell other law enforcement agencies on the power of its biometric profiling software.

In 2017, IBM released Intelligent Video Analytics 2.0, a product with a body camera surveillance capability that allows users to detect people captured on camera by “ethnicity” tags, such as “Asian,” “Black,” and “White.”

And there's a counter-narrative that disputes the NYPD's assertions about the controversial image tagging features. The IBM researcher who helped develop the skin tone recognition feature is on record stating the company doesn't develop features unless there's a market for them. In his estimation, the NYPD approached IBM to ask for this feature while the 8-year pilot program was still underway. The NYPD may have opted out after the feature went live, but it may have done so only to steer clear of future controversy. An ulterior motive doesn't make it the wrong move, but it also shouldn't be assumed the NYPD has morphed into a heroic defender of civil liberties and personal privacy.

What's available to other law enforcement agencies -- ones less concerned about future PR black eyes -- is "mass racial profiling" at their fingertips. IBM has built a product that appeals to law enforcement's innate desire to automate police work, replacing officers on the street with cameras and software. Sure, there will be some cameras on patrol officers as well, but those are just for show. The real work of policing is done at desks, using third-party software that explicitly allows -- if not encourages -- officers to narrow down suspect lists based on race. In a country so overly concerned about terrorism, this is going to lead to a lot of people being approached by law enforcement simply because of their ethnicity.

An additional problem with IBM's software -- and with those produced by its competitors -- is that many of the markers used to identify potential suspects can easily net a long list of probables who share nothing but similar body sizes or clothing preferences. Understandably, more work is done by the investigators manning these systems before cops start rounding people up, but the potential for inadvertent misuse (never mind actual misuse) is still incredibly high.

The secrecy of these programs is also an issue. Restrictive NDAs go hand-in-hand with private sector partnerships, and police officials often translate them to mean information must be withheld from judges, criminal defendants, and department oversight. When that happens, due process violations pile up atop the privacy violation wreckage until the whole thing collapses under its own audacity. Nothing stays secret forever, but entities like the NYPD and IBM could do themselves a bunch of favors by engaging in a little proactive transparency.

Filed Under: facial recognition, new york, nypd, racial profiling
Companies: ibm


Reader Comments

  1. That Anonymous Coward (profile), 10 Sep 2018 @ 3:34am

    "It was specifically to avoid even the suggestion or appearance of any kind of technological racial profiling."

    Did a cooler head prevail by pointing out the sheer number of times y'all were racially profiling people with no evidence beyond skin tone & religious beliefs?

    Did the huge piles of articles showing that other countries really don't want your 'experts' on the scene of terrorism events (b/c they immediately try to blame the brown guy) have anything to do with it?

    Perhaps we would all be better off if we stopped having 'secret' programs that we hide from the view of the courts.
    If you aren't willing to bring it to court, why are you using it?
    Is it because the court would tear your case apart??

    This whole we can't let the terrorists know how our programs work cover story is wearing really thin.
    If you look a majority of these programs are at their heart the darker your skin the fewer rights you have.

    Perhaps if we stopped the cops from treating those of darker skin tones so poorly, those communities would trust them more to speak out when they see something is wrong. As it stands now we've seen them sneak a CI into a mosque who so alarmed the members they called the FBI about him & they left the CI in place a while longer. Can you imagine how far over the top the CI needed to be that Muslims would reach out to the FBI (who used those all Muslims are evil training manuals until it was reported on) for help knowing that they themselves would most likely be on the receiving end of a rights trampling colonoscopy?


  2. JoeCool (profile), 10 Sep 2018 @ 5:27am

    Riiiiiight

    “Where such tools came with a test version of the product, the testers were instructed only to test other features (clothing, eyeglasses, etc.), but not to test or use the skin tone feature. That is not because there would have been anything illegal or even improper about testing or using these tools to search in the area of a crime for an image of a suspect that matched a description given by a victim or a witness. It was specifically to avoid even the suggestion or appearance of any kind of technological racial profiling.”

    Because we all know how good the NYPD is at following directions. ;)


  3. Anonymous Coward, 10 Sep 2018 @ 6:28am

    As a white guy with a suntan I strongly object to this.

    But seriously, look how profiling worked for Jean Charles de Menezes, the Brazilian electrician on his way to work on the London Underground who was shot and killed by terrorism officers (should I have put anti- in there or not?) for no other reason than his skin colour.


  4. Anonymous Coward, 10 Sep 2018 @ 6:47am

    Not asking the important questions...

    Did they include phrenology in the facial recognition?

    I mean, once you've measured the citizens' faces to that level of precision, you should be able to tell who is and is not a criminal just by measuring the space between the eyes.


  5. Talmyr, 10 Sep 2018 @ 7:00am

    IBM, as if!

    This is the same IBM that worked flat-out to automate the process of reducing numbers of certain German and Eastern European populations deemed "undesirable" 80 years ago, right?


  6. I.T. Guy, 10 Sep 2018 @ 7:29am


  7. ryuugami, 10 Sep 2018 @ 7:53am

    IBM building systems for racial profiling? Yup, that doesn't ring any alarm bells. None whatsoever. Yeah. Everything's fine.


  8. ShadowNinja (profile), 10 Sep 2018 @ 8:19am

    Racial profiling is worthless against so few terrorists

    There's a certain paradox law enforcement should be aware of, one that explains why this racial profiling software is bound to fail miserably in practice.

    Imagine there's a deadly virus that has infected only a tiny percentage of the population (a fraction of a percent). Left untreated, it could spread to a ton of other people and cause massive harm. But let's say we have a test that's 99% accurate at diagnosing who has this deadly disease.

    You might think with those numbers that we could safely find everyone with the disease if everyone were willing to get screened for it. But that's not true at all. Because such a tiny percentage of the population actually has the disease, even a 99% accurate test would be nearly worthless: it would flag far more people as having the disease than actually have it. Even testing people a second time to verify the original results wouldn't fix this; with so few people actually infected, you'd still flag far more people who don't have it than who do.

    This is essentially the situation with stopping terrorists. Law enforcement loves to lean on racial profiling to find terrorists, but because there are so few of them, and so many non-terrorists (over 300 million people in the US alone), the profiling becomes worthless: it will virtually always flag non-terrorists instead. The sketch below runs the numbers.
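    To make the arithmetic concrete, here's a minimal sketch in Python, using the 99% accuracy above and assuming 0.1% as the "fraction of a percent" infection rate (that prevalence figure is an illustrative pick, not from the comment):

        # Base-rate fallacy: a "99% accurate" test applied to a rare condition.
        # Assumed numbers: 1M people screened, 0.1% prevalence,
        # 99% sensitivity (catches the sick), 99% specificity (clears the healthy).
        population = 1_000_000
        prevalence = 0.001         # fraction actually infected (assumed)
        sensitivity = 0.99         # P(test positive | infected)
        specificity = 0.99         # P(test negative | healthy)

        infected = population * prevalence             # 1,000 people
        healthy = population - infected                # 999,000 people

        true_positives = infected * sensitivity        # 990 sick people flagged
        false_positives = healthy * (1 - specificity)  # 9,990 healthy people flagged

        ppv = true_positives / (true_positives + false_positives)
        print(f"Total flagged: {true_positives + false_positives:,.0f}")   # 10,980
        print(f"Chance a flagged person is actually sick: {ppv:.0%}")      # ~9%

    Roughly nine out of ten people the test flags are perfectly healthy -- and since terrorists are far rarer than one in a thousand, a profiling "hit" is even more likely to be an innocent person.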


  9. Anonymous Coward, 10 Sep 2018 @ 8:23am

    Good to see that this was an IBM project

    IBM has a storied history of projects going over deadline and over budget, then ultimately being scrapped for failure to perform. If we're going to have an unconstitutional policing program, I can't think of any company I'd rather have failing on it than IBM.


  10. Bamboo Harvester (profile), 10 Sep 2018 @ 8:52am

    Re: Racial profiling is worthless against so few terrorists

    Well said, but your analogy isn't quite right.

    Would you "racial profile" your medical screening if the Killer Disease was Sickle Cell Anemia?

    I've seen people actually rant about how some DRUGS are racist because MAOs are "specified for Black Men!!!" over other BP drugs.

    I think the outrage over this is overboard. The assumption is that crowds are scanned looking ONLY for "potential threats".

    Nobody wants to sit in front of monitors day in day out watching random crowds - video review is ALWAYS targeted.

    If a store is robbed and the criminals are caught on video, skin (or head) color is a MAJOR factor in recognition software. Mainly because it usually *removes* 99% of the potentials.

    I don't know how fine the "color match" can be tuned, but I suspect it's quite fine.

    So, what's next, cops will be banned from posting over the radio that their "suspect is a black male 25-30 wearing a gray hoodie"?

    After all, that's racist, sexist, and ageist profiling. Not to mention the fashion police....


  11. OldMugwump (profile), 10 Sep 2018 @ 9:18am

    Nothing wrong with using skin color as a search qualifier

    ...if you're searching for a specific suspect.

    If you have a report that the suspect in question is 7'2" tall, with blue hair, purple eyes, and green skin, then that's what you look for. That's not racial profiling.

    If you start hassling random green-skinned people for no reason, that's racial profiling.


  12. SirWired, 10 Sep 2018 @ 9:35am

    It's a perfectly normal search constraint.

    "Skin Color" is a perfectly normal thing to search for when trying to find a suspect for a particular crime; it's no more problematic than gender, dress, facial hair, etc.

    Of course it should not be used for "predictive" policing (e.g. "The more brown people, the more officers"), but if looking for a single person, it's perfectly reasonable.

    On another note, the whole bit about IBM and the Holocaust was completely overblown, and it was never any kind of secret. IBM did work on the German census, and religion was one of the questions on said census, but that is a perfectly normal census question. While it's not on the US census, it most certainly is on census questionnaires in many countries even today, including such despotic regimes as Australia and the UK. (This usually comes up when some smart-ass tries to get 'Jedi' included in the official statistics.)


  13. Will B., 10 Sep 2018 @ 9:46am

    Hit the nail on the head, Mugwump:

    >If you start hassling random green-skinned people for no reason, that's racial profiling.

    Yeah, that's exactly what the NYPD tends to do, which is why people are concerned.


  14. Anonymous Coward, 10 Sep 2018 @ 9:47am

    Re: It's a perfectly normal search constraint.

    It is now acceptable, iirc, down under to claim Jedi Knight as your official religion.

    One has to wonder what it is about such questions that makes them important to whoever thinks they're important. Why does anyone in government need such information, and to what purpose is it used?

    Is something considered normal just because other people are doing it?


  15. Zgaidin (profile), 10 Sep 2018 @ 11:18am

    Re: Re: It's a perfectly normal search constraint.

    Like all information, demographic information about things like religion, sexual orientation, ethnicity, etc. can be used for good or ill. Given our rightly suspicious view of governments, it's hard to see how it can possibly be good; but since census data is public, it can actually be a powerful tool against governments. For example, how do you point out that a wildly disproportionate percentage of prison inmates in the US are black males if you don't know what percentage of the total population they constitute?


  16. Zgaidin (profile), 10 Sep 2018 @ 11:26am

    It Depends... ?

    Yes, NYPD has a bad history of racial profiling and discrimination and they should be viewed with suspicion on those grounds. However, as others have pointed out, it depends entirely on how this software was being used.

    If they were using it to identify "hot spots" of certain ethnic concentrations to inform officer deployment, yeah that's bad. You can't do that.

    If they were retaining the footage so they could go back when searching for a specific subject, trying to find historical clues as to his or her whereabouts, that's bad too - at least in my mind but for reasons unrelated to race. At that point, race is just a filter on the database query to speed the process.

    If they're using it on live footage to look for a specific suspect based on physical description, that just makes sense. If, however, they are NOT doing that "specifically to avoid even the suggestion or appearance of any kind of technological racial profiling" then that's also bad, because it's stupid. If you know your suspect is a "short, white male, 35-45, light brown hair," don't waste time having the system look at where all the tall, brown, black, quite young, quite old, or red-headed people are. That's pointless.


  17. JoeCool (profile), 10 Sep 2018 @ 11:40am

    Re: Re: Racial profiling is worthless against so few terrorists

    > So, what's next, cops will be banned from posting over the radio that their "suspect is a black male 25-30 wearing a gray hoodie"?

    Yep! :)

    https://www.youtube.com/watch?v=tQByeGkJJSc


  18. ECA (profile), 10 Sep 2018 @ 12:11pm

    Can we??

    Use this on all the Major corps and Bookkeepers??
    All those WHITE GUYS, that take about 10000 times what any person would get from a 7/11..
    Robbing the retirement funds and Bank accounts of MANY..


  19. Lawrence D’Oliveiro, 10 Sep 2018 @ 4:19pm

    Re: "Skin Color" is a perfectly normal thing to search for

    Not if it’s going to flood your work queue with false positives.

    Unless, of course, false positives of a certain skin colour are exactly what you want to have...


  20. Anonymous Coward, 10 Sep 2018 @ 7:44pm

    Re: Re: Re: It's a perfectly normal search constraint.

    If the government is not supposed to discriminate based upon religion, why do they want to know ... or even care what a person's religion is?


  21. Anonymous Coward, 11 Sep 2018 @ 1:57am

    Re: IBM, as if!

    The same one, yep.


  22. Wendy Cockcroft, 11 Sep 2018 @ 5:37am

    Re:

    Agreed. How utterly horrifying!

    Also clothing label Hugo Boss, steelmaker Krupp, and these companies: http://www.holocaustresearchproject.org/economics/igfarben.html


