The Criminal Justice System Is Relying On Tech To Do Its Job And That's Just Going To Make Everything Worse
from the laying-the-groundwork-for-human-misery dept
The criminal justice system appears to be outsourcing a great deal of its work. On the law enforcement side, automatic license plate readers, facial recognition tech, and predictive policing have replaced beat cops walking the streets and patrolling the roads. Over on the judicial side, analytic software is helping make sentencing decisions. This is supposed to make the system better by removing bias and freeing up government personnel to handle more difficult duties algorithms can't handle.
As is the case with most things government, it works better in theory than in practice. ALPRs create massive databases of people's movements, accessible by hundreds of law enforcement agencies subject to almost zero oversight. More is known about facial recognition's failures than its successes, due to inherent limitations that churn out false positives at an alarming rate. Predictive policing is the algorithmic generation of self-fulfilling prophecies, building on historical crime data to suggest future crimes will occur in high crime areas.
While the judicial side might seem more promising because it could prevent judges from acting on their biases when handing down sentences, the software can only offer guidance, which can easily be ignored. On top of that, the software introduces its own biases based on the data it's fed.
The logic for using such algorithmic tools is that if you can accurately predict criminal behavior, you can allocate resources accordingly, whether for rehabilitation or for prison sentences. In theory, it also reduces any bias influencing the process, because judges are making decisions on the basis of data-driven recommendations and not their gut.
You may have already spotted the problem. Modern-day risk assessment tools are often driven by algorithms trained on historical crime data.
As we’ve covered before, machine-learning algorithms use statistics to find patterns in data. So if you feed it historical crime data, it will pick out the patterns associated with crime. But those patterns are statistical correlations—nowhere near the same as causations. If an algorithm found, for example, that low income was correlated with high recidivism, it would leave you none the wiser about whether low income actually caused crime. But this is precisely what risk assessment tools do: they turn correlative insights into causal scoring mechanisms.
Correlation is not causation. Past performance is not indicative of future results. And an individual being sentenced is not the average of 20 years of historical crime data. The software may steer judges away from personal biases, but it creates new ones to replace them. It's a lateral "improvement" that does little more than swap the inputs.
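The correlation-versus-causation failure described above can be made concrete with a toy simulation. Everything here is invented for illustration (the confounder, the probabilities, the variable names); it is not modeled on any real risk assessment tool:

```python
import random
from statistics import mean

random.seed(0)

# Synthetic records: income bracket and rearrest are BOTH driven by a hidden
# confounder (say, policing intensity in a neighborhood), so income carries
# no causal information about reoffending.
records = []
for _ in range(1000):
    heavily_policed = random.random() < 0.5                  # hidden confounder
    low_income = heavily_policed and random.random() < 0.7
    rearrested = heavily_policed and random.random() < 0.6   # arrests, not crimes
    records.append((low_income, rearrested))

def naive_risk_score(low_income_flag):
    """P(rearrest | income bracket): a correlation dressed up as a score."""
    group = [rearrest for income, rearrest in records if income == low_income_flag]
    return mean(group)  # fraction of the group that was rearrested

low, other = naive_risk_score(True), naive_risk_score(False)
print(f"risk score, low income: {low:.2f}  other: {other:.2f}")
```

The "tool" ends up penalizing low income even though, by construction, the only real driver was policing intensity: a correlative insight repackaged as a causal scoring mechanism.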
Once you've got a system brought up to speed on garbage in, the biases multiply and perpetuate. Sentencing decisions based on biased data generate more bad data for the sentencing software… which then leads to successively harsher sentences for the same criminal act with each iteration. As the recursive data rolls in, the sentencing recommendations will justify themselves because who can argue with raw data?
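A back-of-the-envelope sketch of that feedback loop, with the starting scores and scrutiny multiplier invented purely to show the compounding dynamic:

```python
# Two otherwise-identical groups; group A starts with a small paper bias.
score_a, score_b = 0.60, 0.50   # invented initial "risk scores"
SCRUTINY = 1.2                  # higher score -> more scrutiny -> more records

for generation in range(5):
    # The over-scored group generates disproportionately more recorded
    # "recidivism", which the next round of scoring is then trained on.
    recorded_a = score_a * SCRUTINY
    recorded_b = score_b * 1.0
    total = recorded_a + recorded_b
    score_a, score_b = recorded_a / total, recorded_b / total
    print(f"generation {generation}: A={score_a:.2f}  B={score_b:.2f}")

# The gap widens every round even though nothing about the groups'
# underlying behavior differed; only the initial scoring did.
```

Each pass through the loop corresponds to one retraining cycle on data the previous scores helped produce, which is why the recommendations appear to "justify themselves."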
This is not to say tech should not be used by the criminal justice system. It's that it needs to be subject to rigorous oversight and its employers made aware of its limitations, including its innate ability to reinforce biases, rather than remove them. This isn't something to be taken lightly. The lives and liberties of Americans are literally at stake. Taking a hands-off approach to tech deployment is highly irresponsible, and it indicates those in power care very little about what happens to the people they serve.
Thank you for reading this Techdirt post. With so many things competing for everyone’s attention these days, we really appreciate you giving us your time. We work hard every day to put quality content out there for our community.
Techdirt is one of the few remaining truly independent media outlets. We do not have a giant corporation behind us, and we rely heavily on our community to support us, in an age when advertisers are increasingly uninterested in sponsoring small, independent sites — especially a site like ours that is unwilling to pull punches in its reporting and analysis.
While other websites have resorted to paywalls, registration requirements, and increasingly annoying/intrusive advertising, we have always kept Techdirt open and available to anyone. But in order to continue doing so, we need your support. We offer a variety of ways for our readers to support us, from direct donations to special subscriptions and cool merchandise — and every little bit helps. Thank you.
–The Techdirt Team
Filed Under: algorithms, criminal justice, technology
Reader Comments
Bold of you to assume those in power think they serve the people instead of the other way around.
No morality
FTFY. Who cares if people's lives are screwed over in the process.
Think of the judges
I mean, having a guaranteed job where you get to make your own rules and cannot be fired sounds great, but it is a very taxing position.
Sure they have people to read everything for them and people to write up their opinions for them.
But actually forming an opinion that doesn't reveal their inner weakness is bloody hard.
Won't someone please think of the judges?
I'm reminded of...
The recent Reply All two-parter on the CompStat system: its history from inception to the point where cops began relying on it, and were actively directed by it not to do their jobs properly, as its horrendous flaws became apparent.
https://www.gimletmedia.com/reply-all/127-the-crime-machine-part-i#episode-player
https://www.gimletmedia.com/reply-all/128-the-crime-machine-part-ii#episode-player
Technology is great
You train the computers with statistics about prejudice and its societal consequences, and they will give you what you want to hear with science on top. It's far more authoritative if a computer shouts "only a dead nigger is a good nigger" and "hang them all and you can't go wrong".
Predictive sentencing could pretend to make sense if jail time could be shown to prevent recidivism and jail could be considered a way to reform and reintegrate criminals: then it would be prevention. But punishing people in anticipation of what they might do makes no sense. Particularly not if jails are the entry card into more crime.
Yet another example of moderation at scale...
Computers are *bad* at context... so how is a computer going to ask itself "is that right?" when lives are on the line and the past history is full of wrongs?
And never mind the highly relevant question of who serves who!
Shooter: Police Officer
Victim: Unarmed
Result: Death
Computer says: Indict!
Good piece on Ars: Yes, “algorithms” can be biased. Here’s why
This, of course, shouldn't be any surprise to anyone who knows even a little bit about how computers work. Computers aren't magic; you can't give them biased data and get an unbiased result. Garbage in, garbage out.
The other difference between algorithms and people is that algorithms are much better at being corrected.
Algorithms can learn something once and remember it throughout their "lifetime."
Algorithms don't die and lose all their accumulated knowledge without passing it on.
Algorithms don't have an ego that prevents them from accepting new and better information.
Algorithms are still new to the job, so it's rather unfair to compare them to humans with decades of experience. They'll get better over time, and they'll get better faster than humans do.
Re:
An easy example is Body Mass Index. It is useful for statistical analysis of populations, but virtually worthless as a serious measure of individual health. The allure of it, like a sentencing algorithm, is that it's easy.
Re: Technology is great
"You train the computers with statistics about prejudice and its societal consequences, and they will give you what you want to hear with science on top."
I'll see you and raise. The very second the algorithms and code leak, what we have is a paradigm where organized criminals will know, to a T, where law enforcement is unlikely to be stationed and ready.
I can, all too easily, envision a new type of criminal who specializes in selling this type of demographic map detailing where the boys in blue are least likely to be present, to any smart burglar with a bitcoin wallet.
Re:
"This, of course, shouldn't be any surprise to anyone who knows even a little bit about how computers work. "
Unfortunately politicians have a very long tradition of not understanding cause and consequence.