New Report On Predictive Policing Shows How New Tech Is Giving Us Little More Than The Same Old Racism

from the recycling-racism dept

The National Association of Criminal Defense Lawyers has just released an in-depth examination of predictive policing. Titled "Garbage In, Gospel Out," it details the many ways bad data based on biased policing has been allowed to generate even more bad data, letting officers engage in more biased policing, now with the blessing of algorithms.

Given that law enforcement in this country can trace itself back to pre- and post-Civil War slave patrols, it's hardly surprising modern policing -- with all of its tech advances -- still disproportionately targets people of color. Operating under the assumption that past performance is an indicator of future results, predictive policing programs (and other so-called "intelligence-led" policing efforts) send officers to places they've already been several times, creating a self-perpetuating feedback loop that ensures the more often police head to a certain area, the more often police will head to a certain area.

As the report [PDF] points out, predictive policing is accurately named, just not in the way its proponents intended. It doesn't predict where crime will happen. It only predicts how police will behave.

If crime data is to be understood as a “by-product of police activity,” then any predictive algorithms trained on this data would be predicting future policing, not future crime. Neighborhoods that have been disproportionately targeted by law enforcement in the past will be overrepresented in a crime dataset, and officers will become increasingly likely to patrol these same areas in order to “observe new criminal acts that confirm their prior beliefs regarding the distributions of criminal activity.” As the algorithm becomes increasingly confident that these locations are most likely to experience further criminal activity, the volume of arrests in these areas will continue to rise, fueling a never-ending cycle of distorted enforcement.
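
To make that loop concrete, here's a minimal toy simulation -- my own sketch, not a model from the report -- of two districts with identical true crime rates, where District A starts with a few more recorded incidents simply because it was patrolled more heavily in the past. Patrols go where the data says the crime is, and crime only enters the data where a patrol is present to observe it:

```python
import random

# Two districts with the SAME underlying crime rate. District A merely
# starts with a couple more recorded incidents (heavier past patrolling).
TRUE_CRIME_RATE = 0.3
recorded = {"A": 12, "B": 10}

random.seed(1)
for day in range(365):
    # "Predictive" allocation: patrol whichever district the historical
    # data says has more crime.
    target = max(recorded, key=recorded.get)
    # Crime occurs at the same rate everywhere, but only incidents in
    # the patrolled district are observed and added to the dataset.
    if random.random() < TRUE_CRIME_RATE:
        recorded[target] += 1

print(recorded)
# District A's small head start wins it every patrol, so only A's
# incidents are ever recorded. After a year the dataset "shows" A as a
# high-crime hotspot and B as crime-free, despite identical true rates.
```

It's a deliberately crude model, but it's the same dynamic researchers have documented in the far fancier math of deployed systems, as the report details below.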

This loop bakes racism into the algorithm, "tech-washing" (as the NACDL puts it) the data to give it the veneer of objectivity. The more this happens, the worse it gets. Neighborhoods become "high crime" areas and police respond accordingly, having convinced themselves that looking busy is preferable to fighting crime. Millions of tax dollars are spent creating these destructive loops -- a perverse situation that asks taxpayers to fund their own misery.

Once an area is determined to be worthy of constant police presence, those living in these areas can expect to have their rights and liberties curtailed. And courts have sometimes agreed with these assessments, allowing officers to treat entire neighborhoods as inherently suspicious and to search and question people who just happen to be in the "wrong place" at literally any time. And that's unlikely to improve until courts start asking tough questions about predictive policing programs.

Data-driven policing raises serious questions for a Fourth Amendment analysis. Prior to initiating an investigative stop, law enforcement typically must have either reasonable suspicion or probable cause. Does a person loitering on a corner in an identified “hotspot” translate to reasonable suspicion? What if that person was identified by an algorithm as a gang member or someone likely to be involved in drug dealing or gun violence? Can an algorithm alone ever satisfy the probable cause or reasonable suspicion requirement? The lack of transparency and clarity on the role that predictive algorithms play in supporting reasonable suspicion determinations could make it nearly impossible to surface a Fourth Amendment challenge while replicating historic patterns of over-policing.

Rights are abridged before and after, all in the name of "smarter" policing that greatly resembles more analog methods like "broken windows" policing or the numerous stop-and-frisk programs that allowed officers to stop and search nearly anyone for nearly no reason. The only difference is how much is being spent and how likely it is that cops and their oversight will believe it's "smarter" just because it's attached to thousands of dollars of computer equipment.

There's more to the report than this -- the above barely scratches the surface. These systems have numerous other problems, including the fact that they're proprietary, which means the companies behind them won't allow their software to be examined by defendants, and that the programs themselves are rarely subject to oversight by either the departments using them or the city governments presiding over those police departments.

Data-driven policing is faulty because it relies on faulty data. It's as simple as that. Here are just a few examples of how "smarter" policing is actively harming the communities it's deployed against.

In response to such advances in crime-mapping technologies, researchers have discovered that the underlying mathematical models are susceptible to “runaway feedback loops, where police are repeatedly sent back to the same neighborhoods regardless of the actual crime rate” as a byproduct of biased police data…

Bad data also infects other police department efforts, such as gang databases -- potential sources of useful intel that have instead been allowed to become landfills for garbage inputs.

For example, CalGang, a database widely used in California, listed 42 infants under the age of 1 as active gang members. Moreover, because there is “no clear, consistent and transparent exit process” for those on the database, it can be assumed that a significant proportion of “gang” designees were added in their teens and preteens. The Chicago Police Department (CPD)’s database includes more than 7,700 people who were added to the database before they turned 18, including 52 children who were only 11 or 12 years old at the time of their inclusion. An investigation published by The Intercept identified hundreds of children between the ages of 13 and 16 listed in the New York Police Department (NYPD)’s gang database in 2018.

The programs have proven so useless in some cases that cities that have long relied on them are now dumping them.

The SSL program was “dumped” by the CPD [Chicago PD] in 2020 after a report published by the City of Chicago’s Office of the Inspector General (OIG) concluded that the SSL had not been effective in reducing violence, and that “of the 398,684 individuals recorded in one version of the model, only 16.3 percent were confirmed to be members of gangs.” In her meeting with the Task Force, Jessica Saunders, formerly a researcher at the RAND Corporation, additionally noted that there was no evidence that any person-based predictive policing strategies like the SSL had proven “effective” by any metrics.

The biggest lie in all of this isn't how it's portrayed to outsiders. It's the lie law enforcement agencies tell themselves: that data-driven policing is better and smarter than the way they used to do things. But it's just the same things they've always done. The tech doesn't give them an edge. It just confirms their biases.

As legal scholar Elizabeth Joh noted in her conversation with the Task Force, the discussion surrounding big data policing programs often assumes that the police are the consumers, or the “end users,” of big data, when they themselves are generating much of the information upon which big data programs rely from the start. Prior to being fed into a predictive policing algorithm, crime data must first be “observed, noticed, acted upon, collected, categorized, and recorded” by the police. Therefore, “every action – or refusal to act – on the part of a police officer, and every similar decision made by a police department, is also a decision about how and whether to generate data.”

Data-driven policing is pretty much indistinguishable from non-data-driven policing. The only difference is how much is being spent on useless tech and what police officers and supervisors are telling themselves to maintain the illusion that biased policing can actually increase public safety and reduce crime.

Filed Under: police, predictive policing, racism
Companies: nacdl


Reader Comments



  • Anonymous Coward, 24 Sep 2021 @ 12:13pm

    I thought that was why they were buying it in the first place

    The uses seem to show that is why they are getting it in the first place. They only use it to harass or justify harsh sentences, and don't even give token thought to prevention, even within their own spurious correlation-is-causation logic.


  • Ninja (profile), 24 Sep 2021 @ 1:18pm

    Structural racism is deeply ingrained in society, and I'm not talking about the US alone. It starts by screwing up black children by denying them the same opportunities white kids have, going all the way up to screwing them by paying lower wages for the same positions or flat out denying them employment and forcing them towards crime as a means of surviving. Then it further screws them with a biased judicial system that's rigged into piling up accusations while denying them the means of defending themselves, which lands them in a completely dysfunctional prison system with punishment as its only goal. And good luck if they decide to be good people and follow the law after serving prison time, innocent or guilty, because nobody is gonna employ them, or they'll be subject to atrocious labor conditions and wages.

    I'm describing far too many countries. Civilization my ass.


    • Anonymous Coward, 24 Sep 2021 @ 2:19pm

      Re:

      Structural racism is deeply ingrained in society, and I'm not talking about the US alone. It starts by screwing up black children

      If you accept that children can be factually categorized as "black", you're part of the problem. The (largely but not exclusively) American concept of "race" is pseudoscience—an invented system of oppression disguised as genetic science. It's like in India, where it's illegal to discriminate by caste, except where that's mandatory in the interest of helping out the disadvantaged castes. You can't tell children they're advantaged or disadvantaged by virtue of skin color (or whatever else), and expect that not to "screw them up" in some way. American children probably wouldn't treat skin color as anything more important than eye color or shoe size if not for what they learn from adults.


      • Ninja (profile), 27 Sep 2021 @ 1:45pm

        Re: Re:

        By black children I mean minority children.
        "You can't tell children they're advantaged or disadvantaged by virtue of skin color (or whatever else), and expect that not to "screw them up" in some way. American children probably wouldn't treat skin color as anything more important than eye color or shoe size if not for what they learn from adults." <<< no, the kids are taught about bigotry and prejudice in general, but the system was built by bigoted people in a way that screws minorities from birth.


  • That Anonymous Coward (profile), 24 Sep 2021 @ 3:56pm

    We took the same training we give our officers, with all of the anecdotes and flat-out racist teachings, and put it in a computer... why is it just as flawed?


  • bhull242 (profile), 24 Sep 2021 @ 7:32pm

    New Report On Predictive Policing Shows How New Tech Is Giving Us Little More Than The Same Old Racism

    In other breaking news, water is wet, people need food, and racist assholes talk online.


  • This comment has been flagged by the community.
    restless94110 (profile), 25 Sep 2021 @ 11:07am

    Stating the obvious

    Uh, it's not racist to realize reality: blacks are much more violent than other races. AI and programming data quickly realize this. Idiots call that racist. Obviously it's just the truth. Machines don't lie. They ain't racist.

    Those that deny the reality of black violence, like you, are the true racists. You are anti-white racists, and I'm betting you are white, It's a mental illness in many white people.

    They call frank reporting and studies as racists because they show the truth, Time for you to get back to the science.


    • Anonymous Coward, 26 Sep 2021 @ 9:39am

      Re: Stating the obvious

      Uh, it's not racist to realize reality: blacks are much more violent than other races. AI and programming data quickly realize this.

      If they are subjected to most of the police violence, they will appear more violent in police data sets, although they are the victims of violence and not the perpetrators.


    • bhull242 (profile), 26 Sep 2021 @ 11:13am

      Re: Stating the obvious

      Uh, it's not racist to realize reality: blacks are much more violent than other races.

      I don’t think you know what “racism” means, because that statement right there is racist in itself, and it only gets worse in context.

      AI and programming data quickly realize this. Idiots call that racist. Obviously it's just the truth. Machines don't lie. They ain't racist.

      Here’s the thing: a machine can only work with the data you put into it. If the cops are plugging in data based on their racist views and unequal enforcement, the computer can’t just magically remove the racism from the data. Garbage in, garbage out.

      Those that deny the reality of black violence, like you, are the true racists.

      No one is saying black violence doesn’t happen. Only an idiot would say that. Saying that blacks are more likely to be violent after controlling for other factors like economic status is the problem here.

      You are anti-white racists, and I'm betting you are white, […]

      Ummmm… Do you not realize that that assumption about their skin color is also racist? Also, it’s pretty typical for bigots to say, “I’m not a bigot! You’re a bigot!” What are you, 12?

      It's a mental illness in many white people.

      And now you show you don’t know what mental illness is on top of not understanding what racism is. Are you going to go for the hat trick?

      They call frank reporting and studies as racists because they show the truth, […]

      Uh, no. For one thing, most studies don’t show that blacks are more likely to commit violence, so that’s just false. Most of the ones that did show such a connection either failed to consider confounding factors (like poverty), had small sample sizes or selection bias, or had other problems.

      Time for you to get back to the science.

      And we’ve got the hat trick! Yeah, based on the context, it’s clear that you don’t understand what science is, either.


  • Coyne Tibbets (profile), 25 Sep 2021 @ 6:57pm

    If you put tomfoolery into a computer, nothing comes out of it but tomfoolery. But this tomfoolery, having passed through a very expensive machine, is somehow ennobled and no-one dares criticize it. -- Pierre Gallois

    Predictive Policing is nothing more than an attempt by police to use the above principle to hide their racism.



  • Lostinlodos (profile), 27 Sep 2021 @ 12:11pm

    Well….maybe?

    “listed 42 infants under the age of 1 as active gang members”

    Guess they start young?

    🤦‍♂️


    • Anonymous Coward, 28 Sep 2021 @ 5:02pm

      Re: Well….maybe?

      Some gangs are basically hereditary and act as an extended family, such that odds are good their children will wind up in them as well. Still a massively wrong approach; "at risk" would be a far more accurate classification.


      • Lostinlodos (profile), 28 Sep 2021 @ 5:30pm

        Re: Re: Well….maybe?

        Uh…I guess?
        I was just picking on the stupidity of the image of an 8mo firing an uzi from the crib. Or stroller.
        Whole new meaning to drive by.


