Unsecured Data Leak Shows Predictive Policing Is Just Tech-Washed, Old School Biased Policing

from the hey-that-data-isn't-accurate-says-AI-company-confirming-data's-accuracy dept

Don't kid yourselves, techbros. Predictive policing is regular policing, only with confirmation bias built in. The only question for citizens is whether or not they want to pay tech companies millions to give them the same racist policing they've been dealing with since policing began.

Gizmodo (working with The Markup) was able to access predictive policing data stored on an unsecured server. The data they obtained reinforces everything that's been reported about this form of "smarter" policing, confirming its utility as a law enforcement echo chamber that allows cops to harass more minorities because that's what they've always done in the past.

Between 2018 and 2021, more than one in 33 U.S. residents were potentially subject to police patrol decisions directed by crime-prediction software called PredPol.

The company that makes it sent more than 5.9 million of these crime predictions to law enforcement agencies across the country—from California to Florida, Texas to New Jersey—and we found those reports on an unsecured server.

Gizmodo and The Markup analyzed them and found persistent patterns.

Residents of neighborhoods where PredPol suggested few patrols tended to be Whiter and more middle- to upper-income. Many of these areas went years without a single crime prediction.

By contrast, neighborhoods the software targeted for increased patrols were more likely to be home to Blacks, Latinos, and families that would qualify for the federal free and reduced lunch program.

Targeted more? In some cases, that's an understatement. Predictive policing algorithms compound existing problems. If cops historically patrolled minority neighborhoods more heavily because of biased, pre-predictive policing habits, feeding that history into the system produces "predictions" of more crime in the places where officers have most often been stationed.

The end result is what you see summarized above: non-white neighborhoods receive the most police attention, which generates more data to feed the machine, which generates more outputs telling cops to do what they've been doing for decades, only more often. Run this feedback loop through enough iterations and it results in the continued infliction of misery on certain members of the population.
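To make the loop concrete, here's a minimal simulation sketch in Python. To be clear, this is not PredPol's actual model (which is proprietary); the neighborhood names, crime rates, and patrol-hour figures are invented for illustration. Both neighborhoods have identical underlying crime rates, but one starts with more recorded crime because it was patrolled more heavily in the past, and patrols are then allocated in proportion to previously recorded crime:

```python
# Toy simulation of the feedback loop described above. This is NOT PredPol's
# algorithm; the neighborhood names and numbers are made up for illustration.
import random

random.seed(0)

# Both neighborhoods have the SAME true crime rate per patrol hour.
TRUE_CRIME_RATE = {"Neighborhood A": 0.10, "Neighborhood B": 0.10}

# Historical bias: A was patrolled more heavily, so it starts with more *recorded* crime.
recorded_crimes = {"Neighborhood A": 20, "Neighborhood B": 10}

PATROL_HOURS_PER_WEEK = 100

for week in range(52):
    total = sum(recorded_crimes.values())
    for hood in recorded_crimes:
        # The "prediction": patrol hours allocated in proportion to previously recorded crime.
        hours = round(PATROL_HOURS_PER_WEEK * recorded_crimes[hood] / total)
        # Crime only enters the data set where officers are present to record it.
        new_crimes = sum(random.random() < TRUE_CRIME_RATE[hood] for _ in range(hours))
        recorded_crimes[hood] += new_crimes

print(recorded_crimes)
# After a year, A still shows roughly twice B's recorded crime, even though the
# underlying rates are identical: the original bias never washes out of the data.
```

The sketch only preserves the initial imbalance; in practice, the stops and arrests generated by the extra patrols add still more records, so the gap can widen rather than merely persist.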

These communities weren’t just targeted more—in some cases, they were targeted relentlessly. Crimes were predicted every day, sometimes multiple times a day, sometimes in multiple locations in the same neighborhood: thousands upon thousands of crime predictions over years.

That's the aggregate. This is the personal cost.

Take the 111-unit Buena Vista low-income housing complex in Elgin. Six times as many Black people live in the neighborhood where Buena Vista is located than the city average.

Police made 121 arrests at the complex between Jan. 1, 2018, and Oct. 15, 2020, according to records provided by the city, many for domestic abuse, several for outstanding warrants, and some for minor offenses, including a handful for trespassing by people excluded from the complex.

Those incidents, along with 911 calls, fed the algorithm, according to Schuessler, the Elgin Police Department’s deputy chief.

As a result, PredPol’s software predicted that burglaries, vehicle crimes, robberies, and violent crimes would occur there every day, sometimes multiple times a day—2,900 crime predictions over 29 months.

That's not policing. That's oppression. Both law enforcement and a percentage of the general public still believe cops are capable of preventing crime, even though that has never been a feature of American law enforcement. PredPol software leans into this delusion, building on bad assumptions fueled by biased data to claim that data-based policing can convert police omnipresence into crime reduction. The reality is far more dire: residents in over-policed areas are confronted, detained, or rung up on bullshit charges with alarming frequency. And this data gets fed back into the software to generate more of the same abuse.

None of this seems to matter to the law enforcement agencies buying this software with federal and local tax dollars. Only one law enforcement official -- Elgin (IL) PD's deputy police chief -- called the software "bias by proxy." For everyone else, it was law enforcement business as usual.

That also goes for the company supplying the software. PredPol -- perhaps recognizing some people might assume the "Pred" stands for "Predatory" -- rebranded to the much more banal "Geolitica" earlier this year. The logo swap doesn't change the underlying algorithms, which have accurately predicted biased policing will result in more biased policing.

When confronted with the alarming findings following Gizmodo's and The Markup's examination of Geolitica predictive policing data, the company's first move was to claim (hilariously) that data found on unsecured servers couldn't be trusted.

PredPol, which renamed itself Geolitica in March, criticized our analysis as based on reports “found on the internet.”

Finding an unsecured server with data isn't the same thing as finding someone's speculative YouTube video about police patrol habits. What makes this bizarre accusation about the supposed inherent untrustworthiness of the data truly laughable is Geolitica's follow-up:

But the company did not dispute the authenticity of the prediction reports, which we provided, acknowledging that they “appeared to be generated by PredPol.”

Geolitica says everything is good. Its customers aren't so sure. Gizmodo received responses from 13 of 38 departments listed in the data, and most sent back written statements saying they no longer used PredPol. That includes the Los Angeles Police Department, an early adopter that sent PredPol packing after discovering it was more effective at generating lawsuits and complaints from residents than actually predicting or preventing crime.

This report -- which is extremely detailed and well worth reading in full -- shows PredPol is just another boondoggle, albeit one that's able to take away people's freedoms along with their tax dollars. Until someone's willing to build a system that doesn't consider all cop data to be created equal, so-called "smart" policing is just putting a shiny tech sheen on old-school cop work that relies on harassing minorities to generate biased busywork for police officers.

Filed Under: biased policing, garbage in, garbage out, police tech, predictive policing, predpol
Companies: geolitica


Reader Comments

  • Anonymous Coward, 27 Dec 2021 @ 4:48pm

    I was never kidding myself, and biased policing is exactly what the voters want. Yes, the police do protect and serve a bunch of racist middle-class assholes terrified of the poors coming to rape us and steal our PlayStation.

    Please not the PlayStation anything but the PlayStation.

  • Pixelation, 27 Dec 2021 @ 8:41pm

    Police say...

    "Hey, we're innocent, AI made us do it!" "Damn Ghost!"

  • That Anonymous Coward (profile), 28 Dec 2021 @ 3:59am

    Huh...
    I thought that bad guys rob rich people cause they have more stuff.
    So this software was telling the police to leave the rich white folk on their own and exposed them to the evil evil poor folks (who weren't being hassled for "predictions") overrunning their unprotected neighborhoods!?!?

  • restless94110 (profile), 28 Dec 2021 @ 4:09am (flagged by the community)

    iMPARTIAL

    Since every single official government stat on violent crime reveals that blacks commit 5 to 7 times the violent crimes than whites or any other race, that would mean:

    1. AI would be correct in thinking blacks commit more crimes and should be policed accordingly (therefore AI is not wrong or whatever it is you are trying to say, which you should know anyway: AI can't be biased! AI isn't human. AI doesn't have emotions!)
    2. Therefore old school policing is not biased. At all. Period.
    3. Stereotypes are generated for a reason having nothing to do with bias and are extremely useful to humans. That's why we all use stereotypes all of the time in the process of life and survival.

    • Anonymous Coward, 28 Dec 2021 @ 8:17pm

      Re:

      That's why we all use stereotypes all of the time in the process of life and survival.

      Is that tacit approval to everyone reporting your shitposting every time? That's just leveraging stereotypes of dumbass Jan 6th apologists like you.

    • cattress (profile), 29 Dec 2021 @ 12:50am

      Re: iMPARTIAL

      First, if you only police black communities, you will mostly only catch black "suspects". That does not mean these statistics represent reality. If police suddenly changed tactics and instead aggressively patrolled predominantly white areas -- where they still hassle black folks disproportionately, but say the really, really white and rich neighborhoods -- and began feeding in data about their arrests and interactions (and depending on how many Karens, Kens, and condo and homeowner associations flag them down to bitch about the car with the expired tag, the people who don't take their trash can up until after dark, or play that damn rap music while swimming in their own backyard pool), then the data would make those neighborhoods look like they need an even greater police presence. After all, that data is not based on actual convictions, charges that were pressed, or even verifiable victims of crime. The data is just what the police say is the complaint or crime, even if no victim is claiming it (and cops love some drug busts, and whites use drugs just as much as any other group), and how the cop dispositions the encounter: someone who doesn't pull over fast enough gets charged with evasion, someone who speaks up about a rights violation gets an interfering charge, someone who puts their hand out to block a blow or keep their face from smacking the pavement gets resisting, maybe battering a cop... It's all garbage in, garbage out. White people, as cops prove regularly, are plenty violent; they just get afforded more leniency and privacy (deny all you want, but cops are known for spousal and family abuse, and their sexual exploitation of victims and sex workers is always coming to light).
      Patrols should be limited to strategic locations in order to quickly respond to calls for help, and for dangerous driving situations like speeding in a school zone or driving aggressively or erratically. No more driving around looking for someone driving too cautiously, or teenagers in groups, or people minding their own business on a corner or front stoop, or to fund a small town through traffic citations.

  • Koby (profile), 28 Dec 2021 @ 5:38am (flagged by the community)

    Racist Computer?

    Was the algorithm fed race data? How did it know?

    • David, 28 Dec 2021 @ 6:04am

      Re: Racist Computer?

      The algorithm was fed data. It made statistics from it. It predicted results from the statistics. That's entirely valid science. It's also prejudice. Acting on those statistics means that you judge people not on what they as a person have done but on what others in a similar group may or may not have done. Laws and policing are supposed to apply to individuals and provide individual encouragement and punishment.

      You can do prejudice in a sound scientific manner, but it remains prejudice. And the legal profession is supposed to enact justice, not prejudice. That means giving different people the same kind of scrutiny, even though you may be more likely to find something when you dig deeper wherever prejudice guides you.

      Now a worse form of prejudice, of course, is the foregone conclusion, and then there is the tendency to treat the same kind of transgression differently depending on whom you catch committing it. This, for example, is done with "predictive sentencing," where you punish people based on what prejudice expects them to do rather than on what they actually did.

      Doing policing based on statistics means that you cement those statistics by not recognising people who are outliers in their groups, for better or for worse. And for effective policing, you particularly would need to give positive and negative encouragement to the positive and negative outliers.

      Prejudiced policing is ineffective for improving society because it removes the incentive for individuals to improve.

      • Koby (profile), 28 Dec 2021 @ 7:42am

        Re: Re: Racist Computer?

        The algorithm was fed data.

        But what kind of data? Was race included in the data? This doesn't answer the question.

        If a homeowner files a police report because of a break-in, the data isn't prejudiced. Patrols need to occur where residents are filing reports.

        • nasch (profile), 28 Dec 2021 @ 7:53am

          Re: Re: Re: Racist Computer?

          Police patrol non-white areas more because they believe non-whites commit crimes more. Therefore they find more crimes there, because that's where they are. The AI is given this data, and predicts more crimes in non-white areas because that's where the police reports come from. Police then patrol those areas more because the AI told them to, generating more arrest records in those areas to feed back to the AI. And so on. Get it now?

          Now you might say that this is appropriate, because in actual fact lower income areas have more crime, regardless of whether there are any police there. The thing is, even if that's true, more police there doesn't help. The crimes don't stop just because there are more police around to arrest people for them, so this whole program isn't helping anyone.

          • Koby (profile), 28 Dec 2021 @ 10:12am

            Re: Re: Re: Re: Racist Computer?

            Police then patrol those areas more because the AI told them to, generating more arrest records in those areas to feed back to the AI. And so on. Get it now?

            No. It depends on many other factors, because arrest records are probably not the only data fed into such a system. For example, if police frequently patrol street #1, someone who gets carjacked on less frequently patrolled street #2 will still file a police report. An algorithm fed data from non-police-initiated incidents would seem to be accurate and non-prejudiced, and would also be a strong measure of effectiveness. If police patrols decrease the frequency or severity of non-police-initiated reports, then they're in the right spot and it has nothing to do with bias. That's why I'm asking what kind of data this thing is being fed.

        • David, 28 Dec 2021 @ 9:09am

          Re: Re: Re: Racist Computer?

          But what kind of data? Was race included in the data? This doesn't answer the question.

          It doesn't matter, because the data will crystallise around a socioeconomic background which in turn is used for treating people according to what other people "like them" do rather than what they themselves do. There is more societal injustice than just race, but of course race is one of the easiest things to base prejudice on since it is usually readily visible and thus serves as a major rough category to "put people in place".

          That a computer also makes it possible to discriminate against white persons who hang out a lot with black people is not all that much of an improvement.

        • Michael, 28 Dec 2021 @ 3:45pm

          Re: Re: Re: Racist Computer?

          Irrelevant. All the software did was send cops to places with a historically high crime rate, making sure that the number of arrests in that area remained high while arrests in other areas remained low (since cops were busy policing "predicted" areas).

          Do you really think that spending millions of dollars on software to tell cops that high crime areas have higher crime rates is a good use of taxpayer dollars?

          Nothing at all was actually being predicted here. It was just money down the drain UNLESS you believe that stationing cops permanently in these (often black) neighborhoods at the expense of other neighborhoods is a good idea.


