Proprietary Algorithms Are Being Used To Enhance Criminal Sentences And Prevent Defendants From Challenging Them
from the like-the-Coke-formula,-but-for-years-of-someone's-life dept
When law enforcement agencies want to know what people are up to, they no longer have to send officers out to walk a beat. It can all be done in-house, using as many data points as can be collected without a warrant. Multiple companies offer "pre-crime" databases that predict criminal activity "hot spots," allowing officers to draw foregone conclusions based on what someone might do, rather than what they've actually done.
Not that it's doing much good. For all the time, money, and effort being put into it, the databases seem to be of little utility.
Many law enforcement agencies use software to predict potential crime hot spots, and the police in Kansas City, Mo., and other places have used data to identify potential criminals and to try to intervene.
[...]
In Chicago, where there has been a sharp rise in violent crime this year, the police have used an algorithm to compile a list of people most likely to shoot or be shot. Over Memorial Day weekend, when 64 people were shot in Chicago, the police said 50 of the victims were on that list.
So much for "intervention." Having a list of people who face a higher risk of being shot doesn't mean much when all it's used for is confirming the database's hunches. However, these same databases are being put to a much more consequential use: determining sentence lengths for defendants who have actually been convicted.
When Eric L. Loomis was sentenced for eluding the police in La Crosse, Wis., the judge told him he presented a “high risk” to the community and handed down a six-year prison term.
The judge said he had arrived at his sentencing decision in part because of Mr. Loomis’s rating on the Compas assessment, a secret algorithm used in the Wisconsin justice system to calculate the likelihood that someone will commit another crime.
We're locking people up for more years based on criminal activity they'll never have the chance to commit. This is nothing new. Sentencing enhancement is based on a lot of factors, not all of them confined to proprietary databases. But what is new are the algorithms used to determine these sentence enhancements, most of which belong to private companies that are completely uninterested in sharing this crucial part of the equation with the public.
In Mr. Loomis's case, the software determined he would be likely to engage in further criminal activity. A so-called "Compas score" -- provided by Northpointe Inc. -- resulted in a six-year sentence for eluding an officer and operating a vehicle without the owner's consent. His lawyer is challenging the sentence enhancement and going after Northpointe, which refuses to release any information about how the Compas score is compiled.
What Northpointe has released are statements that confirm the code is proprietary and that the Compas score is "backed by research" -- although it is similarly unwilling to release this research.
The problem here isn't so much the use of algorithms to determine sentence lengths. After all, state and federal sentencing guidelines -- which include factors such as the likelihood of future criminal activity -- are used all the time. But those guidelines can be viewed by the public and are much more easily challenged in court.
The use of private contractors to provide input on sentencing renders the process opaque. Defendants can't adequately challenge sentence enhancements without knowing the details of the "score" being presented by prosecutors to judges. The algorithms' inner workings should either be made available to defendants upon request, or the "score" should be determined solely by government agencies, where the data and determining factors can be inspected by the public.
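To make the contrast concrete, here's a purely hypothetical sketch (in Python) of what a transparent risk score could look like. This is a toy weighted checklist, not Northpointe's actual methodology (which remains secret); the factor names and weights are invented solely for illustration. The point is that when the factors and weights are published, a defendant can reproduce the score line by line and contest any individual input.

```python
# Hypothetical, fully transparent risk score -- NOT the actual Compas
# methodology, which Northpointe keeps secret. The factors and weights
# below are invented purely for illustration.

# Published weights: anyone can read, audit, and challenge them.
WEIGHTS = {
    "prior_convictions": 0.30,        # added per prior conviction
    "age_under_25": 0.50,             # flat bump for defendants under 25
    "prior_failure_to_appear": 0.40,  # added per missed court date
}

def risk_score(defendant):
    """Sum the published weights for each factor that applies,
    keeping a per-factor breakdown so the score can be contested."""
    score = 0.0
    contributions = {}
    for factor, weight in WEIGHTS.items():
        contribution = weight * defendant.get(factor, 0)
        contributions[factor] = contribution
        score += contribution
    return score, contributions

# A defense attorney can reproduce the score exactly...
defendant = {"prior_convictions": 2, "age_under_25": 1}
score, contributions = risk_score(defendant)

print(f"Total risk score: {score:.2f}")
for factor, contribution in contributions.items():
    print(f"  {factor}: +{contribution:.2f}")

# ...and argue, for example, that the age factor is an improper proxy.
# With a sealed, proprietary model, none of this scrutiny is possible.
```

With a score like this, the fight happens in open court over specific, visible factors. With a proprietary score, the defendant is left arguing against a number.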
We're now in the unfortunate situation where companies are telling judges how long someone should be locked up -- using data which itself might be highly questionable. The feeling seems to be that if enough data is gathered, good things will happen. But as we can see from Chicago's implementation of this technology, the only thing it's done so far is add confirmation bias toe tags to the ever-increasing number of bodies in the city's morgues.
The use of locked-down, proprietary code in sentencing is more of the same. It undermines the government's assertion that prison sentences are a form of rehabilitation and replaces it with the promise that criminal defendants will "do the time" so they can't "do the crime" -- all the while preventing those affected from challenging this determination.
Filed Under: compas assessment, crime, predictions, proprietary algorithms, repeat offenders, sentencing, thought crimes
Companies: northpointe