LAPD Blames Predictive Software For Misconduct And Abuse, Rather Than Its Own Disinterest In Holding Officers Accountable

from the it's-never-a-cop's-fault dept

As long as we're heading into an age of predictive policing, it's good to know that some police departments are willing to turn the ThoughtCrime scanner on their own employees.

Police departments across the U.S. are using technology to try to identify problem officers before their misbehavior harms innocent people, embarrasses their employer, or invites a costly lawsuit — from citizens or the federal government.
Of course, some of this is just "insider threat" detection that ousts whistleblowers before they can blow the whistle and punishes employees for not adhering to the prevailing mindset. Nothing about this software is anywhere close to perfect, but it's still being used to (hopefully) head off police misconduct before it occurs. Yet what the system flags doesn't seem to be stopping cops before they do something regrettable.
The systems track factors such as how often officers are involved in shootings, get complaints, use sick days and get into car accidents. When officers hit a specific threshold, they're supposed to be flagged and supervisors notified so appropriate training or counseling can be assigned.
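To make the mechanics concrete, here's a minimal sketch (in Python) of how a threshold-based flag like the one described above might work. The factors are the ones listed in the report, but every threshold, field name, and function here is a hypothetical illustration -- none of it reflects the LAPD system's actual parameters or code:

from dataclasses import dataclass

@dataclass
class OfficerRecord:
    name: str
    shootings: int       # officer-involved shootings in the review window
    complaints: int      # complaints received
    sick_days: int       # sick days used
    car_accidents: int   # car accidents

# Hypothetical per-factor thresholds for a single review window.
THRESHOLDS = {"shootings": 1, "complaints": 3, "sick_days": 15, "car_accidents": 2}

def alerts_for(officer):
    """Return the factors on which this officer meets or exceeds a threshold."""
    return [f for f, limit in THRESHOLDS.items() if getattr(officer, f) >= limit]

def flag_officers(roster):
    """Generate an alert for every officer who trips at least one factor."""
    for officer in roster:
        tripped = alerts_for(officer)
        if tripped:
            # A real system would notify the officer's supervisor so training
            # or counseling can be assigned; here we just print the alert.
            print(f"ALERT: {officer.name} flagged on: {', '.join(tripped)}")

flag_officers([
    OfficerRecord("Officer A", shootings=0, complaints=4, sick_days=8, car_accidents=0),
    OfficerRecord("Officer B", shootings=0, complaints=1, sick_days=5, car_accidents=1),
])

Note where the sketch stops: generating the alert is the trivial part. Everything the article describes going wrong happens after the print statement, when a supervisor does (or doesn't do) something with it.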
Proponents of the system point out that its greatest value is as a deterrent. Even so, it's still relatively worthless.
The Los Angeles Police Department agreed to set up its $33 million early warning system after the so-called Rampart scandal, in which an elite anti-gang unit was found to have beaten and framed suspected gang members. The system was implemented in 2007.

The LAPD's inspector general found in a recent review that the system has been largely ineffective at identifying officers who were ultimately fired. The report looked at 748 "alerts" over a four-month period and found the agency took little action in the majority of cases, requiring training in only 10 of them -- 1.3 percent.
The LAPD presents this as a software failure -- and some of it is. What's being flagged isn't necessarily indicative of potential misconduct. But beyond the algorithm, there's an integral part of the system being ignored:
Experts say the early warning system can be another powerful tool to help officers do their jobs and improve relations, but it is only as good as the people and departments using it… "These systems are designed to give you a forewarning of problems and then you have to do something."
Even the IG's report notices that nothing's being done: 748 "alerts" resulted in action on only 10 of them. The LAPD is trying to portray this as a software failure, most likely in hopes of ditching a system that was forced on it by its own bad behavior. (The irony here is that police departments will argue predictive software doesn't work on cops but does work on citizens.)

But it's not just the software. It's the LAPD. Long before the Rampart scandal of the late '90s uncovered massive corruption in the force, the LAPD's Internal Affairs division was doing absolutely nothing to hold officers accountable for misconduct.
The Christopher Commission (1991) in Los Angeles found that the Internal Affairs Division (IAD) of the LAPD had sustained only 2 percent of the excessive force complaints and stated: "Our study indicates that there are significant problems with the initiation, investigation, and classification of complaints." It called the IAD investigations "unfairly skewed against the complainant."
More recent reports [pdf] still show that public complaints are almost never sustained (3.5%). Even factoring in the much higher rate given to complaints from other officials and officers (45%), the overall rate still routinely sits near 10%.
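A quick back-of-the-envelope calculation shows why the blended rate can sit near 10 percent despite that 45 percent figure: public complaints vastly outnumber internal ones. The complaint counts below are hypothetical, chosen only to illustrate the arithmetic; the two sustain rates are the ones from the report above:

# Overall sustain rate as a volume-weighted mix of two complaint sources.
# Counts are hypothetical; only the 3.5% and 45% rates come from the report.
public_complaints = 2000     # hypothetical: complaints filed by the public
internal_complaints = 350    # hypothetical: complaints from officials/officers

sustained = 0.035 * public_complaints + 0.45 * internal_complaints
total = public_complaints + internal_complaints
print(f"Overall sustain rate: {sustained / total:.1%}")  # -> 9.7%

With a mix like that, the rate the public actually experiences stays pinned near the 3.5 percent floor, no matter how seriously the department treats complaints from its own.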

This isn't just Los Angeles. Overall, the nation's law enforcement agencies are only sustaining 8% of complaints. Officers have seemingly unlimited "strikes" before misconduct costs them their jobs. Combine that with the low sustain rate and officers know they can get away with a lot before they receive any discipline.

And if that isn't enough, these flagging systems create their own perverse incentives:
A 2011 Justice Department report found the New Orleans Police Department's system, adopted roughly two decades ago, was "outdated and essentially exists in name only." Investigators said information was included haphazardly and flagged officers were put into essentially "bad boy school," a one-size-fits-all class seen by some as a badge of honor.
No doubt more than a few New Orleans residents received a few extra nightstick swings/Taser shots just so Officer X could hang with the big boys. Fun stuff.

But on the other side of the coin lies the LA Sheriff's Department -- at least in terms of predictive software.
The sheriff's department has an early warning system. "Our diagnostic systems were fine," said the department's Chief of Detectives, Bill McSweeney, who advised his agency on creation of the warning system. "Our managerial and supervision response was not fine. It's that simple."
The LASD is finally acknowledging that it let its officers (and jail guards) act with impunity for far too many years. The system could have worked -- at least within its limited capabilities -- but no one wanted to follow up on flagged officers. The situation there has deteriorated to the point that the LASD is looking at a few years of federal supervision.

Predictive policing is still a bad idea, even for policing police. While data may help pinpoint problem areas, the flagging systems are far too inaccurate to guarantee hits. But the problem within law enforcement agencies is the lack of accountability, not faulty software. Unless the first problem is addressed, it won't matter how much the software improves in the future.



Filed Under: abuse, accountability, lapd, police abuse, predictive software


Reader Comments



  1. That One Guy (profile), 9 Sep 2014 @ 7:56pm

    Using another technology as an example:

    There's no point in having a fire alarm if you don't pay any attention to it when it goes off.


  2. ysth (profile), 10 Sep 2014 @ 12:06am

    Possible use here...

    While I agree that predictive policing is a bad idea, perhaps it could be used here simply to prioritize who gets body cams first, while these departments spread the purchases necessary to achieve 100% coverage over, say, 3 years.

    Like that would happen.


  3. Pedant, 10 Sep 2014 @ 12:45am

    "Lack of interest" not "Disinterest"

    'Disinterest' suggests strictly held impartiality. Not, I think, what you were trying to imply here.


  4. Anonymous Coward, 10 Sep 2014 @ 4:59am

    Anything available that can be used as a way out will be used! The truth of the matter is, there should be no need for a way out to begin with!


  5. Anonymous Coward, 10 Sep 2014 @ 6:33am

    I think adding sensors to their equipment would make the difference: embed chips into all their tools (firearms, nightsticks, tasers and such), as well as body cams and motion sensors, so they could record actual movements at all times. Insert them in their badges, shoes, and unintrusive wrist bands.


  6. Uriel-238 (profile), 10 Sep 2014 @ 7:03am

    ...Or even if they let things get worse...

    Well, so long as they're trying, even if they don't improve the situation and cops continue to gun down innocent people and use tasers and pepper spray in lieu of talking to citizens, this remains a tolerable situation.

    Right?


  7. GMont, 10 Sep 2014 @ 12:41pm

    33 million dollars!

    33 million dollars!!!???!!!

    For software that collects bad behaviour reports!!!!!!!
    Bad Behaviour Reports that are then simply ignored by the cops anyway.

    Next time, get a junior high school kid to write the software for them. It won't cost a fortune, it'll likely work way better, and if the cops ignore the results, at least the taxpayer is not out 33 million for the effort.

    Boy, talk about milking the taxpayer.


  8. John Fenderson (profile), 10 Sep 2014 @ 12:52pm

    Re:

    Sometimes I wish I had no sense of ethics at all. I'd be a millionaire.


