LAPD's Failed Predictive Policing Program The Latest COVID-19 Victim
from the [hastily-signs-DNR-certificate] dept
Fucking predictive policing/how the fuck does it work. Mostly, it doesn't. For the most part, predictive policing relies on garbage data generated by garbage cops, turning years of biased policing into "actionable intel" by laundering it through a bunch of proprietary algorithms.
More than half a decade ago, early-ish adopters were expressing skepticism about the tech's ability to suss out the next crime wave. For millions of dollars less, average cops could have pointed out hot crime spots on a map based on where they'd made arrests, while coming nowhere close to the reasonable suspicion needed to declare nearly everyone in a high-crime area a criminal suspect.
The Los Angeles Police Department's history with the tech seems to indicate it should have dumped it years ago. The department has been using some form of the tech since 2007, but all it seems to be able to do is waste limited law enforcement resources to violate the rights of Los Angeles residents. The only explanations for the LAPD's continued use of this failed experiment are the sunk cost fallacy and its occasional use as a scapegoat for the department's biased policing.
Predictive policing is finally dead in Los Angeles. Activists didn't kill it. Neither did the LAPD's oversight. Logic did not finally prevail. For lack of a better phrase, it took an act of God {please see paragraph 97(b).2 for coverage limits} to kill a program that has produced little more than community distrust and civil rights lawsuits. Caroline Haskins has more details at BuzzFeed.
An LAPD memo dated April 15 quoted Police Chief Michel R. Moore saying that the police department would stop using the software, effective immediately, not because of concerns that activists have raised but because of financial constraints due to COVID-19, the disease caused by the novel coronavirus.
"The city's financial crisis, coupled with the impact of the COVID-19 pandemic, has resulted in the immediate freeze of new contractual agreements and 'belt-tightening' instructions by the Mayor to all city departments for all further expenditures," the memo said. "Therefore, the Department will immediately discontinue the use of PredPol and its associated reports."
Activists like Hamid Khan of the Stop LAPD Spying Coalition are calling this a win. And it is, sort of. When something you want stopped stops, it's still a victory, even if it appears to be due to unforeseeable developments rather than local activism. This doesn't mean Khan and others shouldn't keep working to keep the LAPD's PredPol system shut down. But it's perhaps too optimistic to declare this turn of events a testament to your activism when it appears the LAPD is only temporarily mothballing a program its budget can't support at the moment.
But we can still hold out hope it won't be resurrected when the current crisis passes. The LAPD has struggled to show the program actually impacts criminal activity more than it does Constitutional rights, despite having more than a decade to do so. We can still celebrate its death, even if it's only being buried in effigy at this point. Maybe by the time this has all passed, the LAPD will realize it hasn't missed the expensive software's dubious contribution to the city's safety and abandon it for good.
Filed Under: algorithms, covid-19, lapd, law enforcement, police, predictive policing
Reader Comments
Praise Objectivity
One of the common complaints is that police exhibit bias, so if police work can be directed without that bias, then I would have to praise that kind of goal. I think the attempt to feed information into a computer system that could objectively make decisions is an admirable one. Unfortunately, the concept of pre-crime analysis is beyond current technological capabilities. The coronavirus is going to be used as a cheap excuse to sweep this one under the rug. Back away slowly, and hope no one will notice.
Re: Praise Objectivity
There is an ancient phrase (by computer industry standards, at least): "Garbage In/Garbage Out". Because police are the people who decide what goes into the predatory policing database, their bias goes in at the same time. One of the most insidious things about predatory policing is that simply interacting with police increases your chances of the system recommending more interactions. Those new interactions pile up in the system, and the ever-increasing record of police contacts tells the system you are a person of ongoing interest to police who should therefore be subjected to increasing harassment.
And it can all start with just a single instance of biased policing.
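The feedback loop described above is easy to sketch. Here's a toy simulation — the two neighborhoods, the true crime rate, and the "patrol wherever there are the most records" rule are all hypothetical illustrations, not PredPol's actual algorithm:

```python
import random

random.seed(42)

# Both neighborhoods have the same true underlying crime rate.
TRUE_CRIME_RATE = 0.1

# The system's only input is prior recorded police interactions.
# Seed it with a single instance of biased policing: one extra
# recorded stop in neighborhood A.
interactions = {"A": 1, "B": 0}

def patrol(neighborhood):
    """An officer patrols; any crime they observe becomes a new record."""
    if random.random() < TRUE_CRIME_RATE:
        interactions[neighborhood] += 1

for day in range(1000):
    # Each day, the patrol goes wherever the database has the most
    # records -- so past interactions drive future interactions.
    target = max(interactions, key=interactions.get)
    patrol(target)

print(interactions)
```

Despite identical crime rates, neighborhood B never gets patrolled again, so it never generates another record, while neighborhood A's record count compounds — exactly the loop that can start from one biased stop.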
Re: Praise Objectivity
"I think the attempt to feed information into a computer system that could objectively make decisions is an admirable one."
But it's one that will never work, because whoever feeds the data into the computer and those who program it all have their own biases, sometimes unconscious ones.
The best analogy is the issues surrounding facial and voice recognition systems. There have been numerous complaints about how both treat minorities, either by returning false positives in law enforcement use cases or by having a higher error rate in personal systems. This is more likely to be an implementation error than deliberate prejudice, but it's there because the people designing and testing these systems are more likely to be white males and less likely to belong to the affected minority.
So, it's quite likely that any such "pre-crime" system will actually reinforce systemic prejudice rather than prevent it. Combined with the tendency for people to believe whatever a computer tells them over the evidence of their own eyes (see: idiots driving into rivers because the GPS told them there was a bridge there even though they can see there's not one there), it's a recipe for disaster without oversight - and the overseers are going to be the ones with the bias problem you're trying to solve.
garbage in == garbage out
Q: Want to really use predictive policing to determine where crimes will occur?
A: Just program in the officer's patrol route for the shift.
Too much Minority Report perhaps?
They should have named the software, "Probable Cause".
Cut the Police!
I guess we have the answer for how to get our rights back: cut police funding until they stop.
What many police officers want is an objective, unbiased system that always agrees with them. Achieving one out of three is not bad, but it is hard to justify with the current economy.
Monkey See....
"Maybe by the time this has all passed, the LAPD will realize it hasn't missed the expensive software's dubious contribution to the city's safety and abandon it for good."
And we can also hope that the LAPD's coming experience without this albatross around its neck may help other PDs around the globe decide to do likewise and end their own similarly wasted expenses.
Hey. We can hope, right?
It's Weird ...
In Area A, where many criminals live, work, and victimize people, actual police officers interact with actual criminals. The police officers in this area go from call to call to call. Police are extremely busy in these areas; this data goes into a computer system.
In Area B, where mostly law-abiding people live and work, police officers don't interact much with actual criminals. They spend most of their time on traffic violations and responding to minor calls like noise complaints. Police aren't very busy in these areas; this data goes into the computer system.
Why does anyone think it's a big mystery that the computer system tells police to spend time and resources in Area A? Why is this confusing? More to the point, why is it controversial?
Obviously the computer system is a waste of taxpayer funds … because everyone in that city knows where the Area As and Area Bs are. The cops don't need an expensive computer system to tell them where crimes are happening and going to happen.
Now, if Cushing, Khan, and most Techdirt commenters happen to notice a racial difference between the people in Area As and Area Bs, that's their problem, not the police department's. Reality is frequently uncomfortable.