AI Isn't Making The Criminal Justice System Any Smarter

from the BIAS-2.0 dept

We've covered the increasing reliance on tech to replace human judgment in the criminal justice system, and the news just keeps getting worse. The job of sentencing is being turned over to software owned by private contractors, which puts a non-governmental party between defendants and challenges to sentence length.

The systems being used haven't been rigorously tested and are as prone to bias as the humans they're replacing. The system used by Washington, DC courts to sentence juvenile defendants has never been examined, and yet it's still being used to determine how long a juvenile's freedom should be taken away.

This system had been in place for 14 years before anyone challenged it. Defense lawyers found nothing that explained the court's confidence in using it to sentence juveniles.

[I]n this particular case, the defense attorneys were able to get access to the questions used to administer the risk assessment as well as the methods of administering it. When they dug into the validity behind the system, they found only two studies of its efficacy, neither of which made the case for the system’s validity; one was 20 years old and the other was an unreviewed, unpublished Master’s thesis. The long-held assumption that the system had been rigorously validated turned out to be untrue, even though many lives were shaped due to its unproven determination of ‘risk’.
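
To put "rigorously validated" in concrete terms: validating a risk tool means checking its scores against real outcomes on cases it hasn't already seen. Here's a minimal sketch of that check in Python, using made-up scores and outcomes rather than the DC tool's actual data or methodology:

    # Hypothetical validation check -- made-up numbers, not the DC tool's data.
    # A risk tool is only "validated" if its scores actually track outcomes
    # on independent cases, not just the sample it was tuned on.
    from sklearn.metrics import roc_auc_score

    # reoffended[i]: 1 if defendant i was rearrested within two years, else 0
    # risk_score[i]: the tool's score for that defendant
    reoffended = [0, 1, 0, 0, 1, 1, 0, 1, 0, 0]
    risk_score = [2, 8, 3, 1, 9, 4, 2, 7, 5, 1]

    # An AUC of 0.5 means coin-flip accuracy; anything close to that means
    # the tool shouldn't be anywhere near a sentencing hearing.
    print(f"AUC: {roc_auc_score(reoffended, risk_score):.2f}")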

One system used in courts all over the nation is developed by Equivant (formerly Northpointe). It's called COMPAS (Correctional Offender Management Profiling for Alternative Sanctions). COMPAS uses a set of questions to determine how much of the book is thrown at defendants, using data that only makes the United States' carceral habits worse.

Northpointe’s core product is a set of scores derived from 137 questions that are either answered by defendants or pulled from criminal records. Race is not one of the questions. The survey asks defendants such things as: “Was one of your parents ever sent to jail or prison?” “How many of your friends/acquaintances are taking drugs illegally?” and “How often did you get in fights while at school?” The questionnaire also asks people to agree or disagree with statements such as “A hungry person has a right to steal” and “If people make me angry or lose my temper, I can be dangerous.”
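
COMPAS's actual formula is sealed, so the following is purely hypothetical, but questionnaire tools of this kind are commonly just weighted sums of answers. That's why the question list matters: answers to items like parental incarceration can correlate with race, letting a score encode race without ever asking about it. A sketch with invented weights:

    # Purely hypothetical illustration -- COMPAS's real formula is proprietary.
    # Weighted-sum scoring over a few of the published question topics;
    # all weights here are invented.
    WEIGHTS = {
        "parent_incarcerated": 2.0,  # "Was one of your parents ever sent to jail or prison?"
        "friends_using_drugs": 1.5,  # "How many of your friends/acquaintances are taking drugs illegally?"
        "school_fights": 1.0,        # "How often did you get in fights while at school?"
        "agrees_hungry_steal": 0.5,  # "A hungry person has a right to steal"
    }

    def risk_score(answers: dict) -> float:
        """Collapse questionnaire answers into a single 'risk' number."""
        return sum(WEIGHTS[q] * answers.get(q, 0) for q in WEIGHTS)

    # Race is never asked, but answers that correlate with race still move the score.
    print(risk_score({"parent_incarcerated": 1, "friends_using_drugs": 3}))  # 6.5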

The US locks up an alarming number of people every year and an alarming percentage of them are black. Feed this data into a system that wants to see if it's locking up enough black people and the data will tell judges to keep hitting black people with longer sentences. It's a feedback loop no one can escape from. Every new sentence using these calculations only adds more data telling the system it's "right."
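
The dynamic is easy to demonstrate with a toy model (an illustration of the feedback loop, not COMPAS's actual code): start two groups at different historical incarceration rates and let each year's sentences become next year's training data.

    # Toy feedback-loop simulation -- illustrates the dynamic, not COMPAS itself.
    # Group B starts with a higher historical rate purely because of past
    # over-policing; the gap still compounds.
    rates = {"A": 0.10, "B": 0.20}  # made-up starting incarceration rates

    for year in range(10):
        for group, rate in list(rates.items()):
            risk = rate  # the model scores each group by its own history...
            rates[group] = min(1.0, rate + 0.05 * risk)
            # ...and the resulting sentences become next year's "evidence"
            # that the score was right.

    print(rates)  # the initial disparity grows every single iteration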

Not only does the "improved" system introduce its own algorithmic biases, its proprietary biases are no better than the human ones it's replacing. The system has been proven wrong repeatedly: it spits out lower recidivism risk scores for white defendants, only to have those defendants commit more crimes in the future than their black counterparts -- even when black people arrested for the same criminal activity have been given considerably higher risk scores by COMPAS.
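
What "proven wrong" means here is a gap in error rates between groups, which is straightforward to measure if you can get outcome data. A sketch of that comparison on made-up records (ProPublica's actual analysis used thousands of real Broward County, Florida cases):

    # Sketch of an error-rate comparison, on made-up records.
    # A tool can look accurate overall and still fail differently by group:
    # more false positives (labeled risky, didn't reoffend) for one group,
    # more false negatives (labeled safe, did reoffend) for the other.
    def error_rates(records):
        """records: list of (predicted_high_risk, actually_reoffended) pairs."""
        fp = sum(1 for pred, actual in records if pred and not actual)
        fn = sum(1 for pred, actual in records if not pred and actual)
        negatives = sum(1 for _, actual in records if not actual)
        positives = sum(1 for _, actual in records if actual)
        return fp / negatives, fn / positives  # (false positive rate, false negative rate)

    # Hypothetical per-group outcomes showing the reported pattern:
    black_defendants = [(True, False), (True, False), (False, False),
                        (True, True), (False, True)]
    white_defendants = [(False, False), (False, False), (True, False),
                        (False, True), (True, True), (False, True)]

    for name, recs in [("black", black_defendants), ("white", white_defendants)]:
        fpr, fnr = error_rates(recs)
        print(f"{name}: false positive rate {fpr:.0%}, false negative rate {fnr:.0%}")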

That's not the only problem. Since it's privately owned, defense lawyers and researchers have been unable to examine the software itself. You may be able to challenge a sentence based on sentencing data (if you can even manage to get that), but you won't be able to attack the software itself, because it wasn't developed by the government.

Equivant doesn’t have to share its proprietary technology with the court. “The company that makes COMPAS has decided to seal some of the details of their algorithm, and you don’t know exactly how those scores are computed,” says Sharad Goel, a computer-science professor at Stanford University who researches criminal-sentencing tools. The result is something Kafkaesque: a jurisprudential system that doesn’t have to explain itself.

The new way gives us the same results as the old way. But it can't be examined. It can only be questioned, and that's not really getting anyone anywhere. A few sentences have been challenged, but every day it's in use, COMPAS keeps generating sentences for "risky" defendants. And these sentences go right back into the database, confirming the software's biases.


Filed Under: ai, compas, criminal justice
Companies: equivant, northpointe


Reader Comments




  1. Anonymous Anonymous Coward (profile), 28 Jun 2019 @ 7:31pm

    Sign here, no need to read the fine print.

    I'd like to see the contract signed by the courts when they 'purchase' this software. Purchase is in quotes because it might be sold as a service, meaning they, like other customers, don't actually own what they buy.

    Somehow I think that the sales contract benefits the seller and prevents any kind of independent evaluation of the product. That the courts, made up of lawyers (judges are lawyers... right?), didn't and still don't question the validity of the software seems somehow more political than judicial or judicious. Is this what we get from electing judges, or are appointed judges just as political?

    That a system of justice can treat software that cannot be proven to provide justice as just is anathema. How is it that these questions have not reached a higher court and the entire concept been shot down?


  2. Anonymous Coward, 28 Jun 2019 @ 8:23pm

    Prisoners, modern slaves

    Don't forget that many prison systems are using the people there as slave labor, creating everything from your underwear to armor. It doesn't matter if you are innocent of the crimes, or mentally incapable of being responsible for your actions, you are in their power and you will do as they command or be punished with extra time added to your sentence. People are profiting off of the children and adults forced into losing their rights. Programs like this are just making the corrupt get more power.


  3. Harold Fck the only thing missing is 'osdi', 28 Jun 2019 @ 8:58pm (flagged by the community)

    Not really Artificial Intelligence. Nor "carceral" a word.

    You're engrandizing simple "computer dating" type software and making up new words out of the blue as go, quite oddly mimicking the flaws of this "system".


  4. Harold Fck the only thing missing is 'osdi', 28 Jun 2019 @ 8:59pm (flagged by the community)

    Re: Not really Artificial Intelligence. Nor "carceral"

    Now, even if I share all your reserves, you must have some data that suggests intrinsic unfairness AND have some workable alternative before can really criticize.

    Would you have the prior highly NON-uniform "system" of judges deciding, with personal spleen or prejudice, irritation from sitting on hemorrhoids all day, and so on? What's your far better alternative, sonny?

    Otherwise, this is just innuendo and lawyering (not necessarily inappropriate for particular cases when a big number turns up).

    Any unfairness resulting is part of larger societal problems with ignoring petty crime and it constantly growing, evident daily right here on Techdirt, that some routinely advocate violating Rights of persons by stealing copyrighted works. As I've long said, anyone who pirates lousy entertainments has slipped the bounds of civilized society and heads toward a career of crime (those already far down that slippery slope will of course disagree); the longer without being caught, the further in excusing: making up entitlement, "not harming anyone", "natural right to copy", blaming the creators for not having a magical "new business model" which is immune to rampant theft, casting creators as Evil, and so on.

    =====

    Again required to break in two before accepted by the mighty "filter". Don't blame me!


  5. Pixelation, 28 Jun 2019 @ 9:11pm

    Stupid is as stupid uses.


  6. Anonymous Coward, 28 Jun 2019 @ 11:53pm

    Pretextual Prejudice as the Goal

    Really, the use of "AI" in sentencing has always seemed aimed not towards rehabilitation or fairness but towards providing a pretext to smuggle in prejudices. Best illustrated by the UK algorithm which inexplicably took garden size (yard, in the US sense) as a sentencing parameter. There is literally no reason to include that other than classism or tribal bias.

    If they thought that yard work would be the crucial ingredient to rehabilitation they would have already included it in their prison system.


  7. That Anonymous Coward (profile), 29 Jun 2019 @ 1:02am

    Everyone is worshiping at the altar of AI, it will be better, smarter, faster....
    I just have one question...
    What perfect person are they modeling the training from?

    AI does away with all of the bias and flaws!!!
    Except for those left in place by well-meaning people who still can't imagine themselves in someone else's shoes.

    The AI is taught the 'common wisdom' that we all know...
    Priests are good people - Uhhh

    Gays are pedophiles - Nope, but the louder the family values guy screams it, the more kids he's diddling.

    Black people rob everyone - This has absolutely nothing to do with laws that specifically target them, that a white kid & a black kid doing the exact same things have drastically different outcomes.

    There is no way to get the bias out of AI, we are tragically flawed teachers who can't see the trees for the forest.

    We can't teach morals (points at the flaming failures of the moral majority & others) because we can't follow them ourselves.

    KILLING A CHILD IS HORRIBLE, EVERY LIFE IS SACRED!!!

    EXECUTE THE PRISONER!!!!

    uhhh what?

    We want to pretend we have compassion for bad childhoods, broken homes, etc etc... but we really don't. You can feel for the teen who was sexually abused, but you are then afraid to send the wrong message if you don't sentence them to death for killing their abuser.

    The best use of AI in the legal system would be making sure charges aren't stacked to the sky to force a plea & to actually push prosecutors to do their jobs when it is clear the law was broken, even if it means some Union head screams at you on tv.

    Another good use would be reviewing sentences, we can give it recidivism rates & all the data on people who were convicted of the same crimes & their sentences.

    Hell AI could even look at QI cases and wonder if the Judges are brain damaged. AI could show us the flaws in cases with the same crimes but the different outcomes, when justice is supposed to be blind but clearly is peeking.

    I would be more interested in seeing AI examine a history of cases & ask questions why the outcomes were different.
    AI doesn't care if the shooter was a cop; the AI cares that an armed person forced their way into a home & killed the person who lived there, & would remind us what the charges should be, and the union head can scream all they want... but it's a computer applying the law evenly in every case without worrying about blue flu & them throwing other cases.


  8. David, 29 Jun 2019 @ 2:25am

    So what are the alternatives?

    How else are you going to justify different legal standards for people of different social classes? The U.S. legal system allows rich persons to have poorer persons thrown in jail or their lives ruined financially, since everybody pays their own legal costs and the competence of your representation is proportionate to what you can pay, meaning that the financially underrepresented party will have to take a plea deal.

    Without sentencing adjusted to the severity of one's social standing, defendants may be tempted to scratch together the money for a competent legal defense and forego the plea deal without being suitably punished for their insubservience to the system.

    This double standard allows for the peaceful coexistence of superior and inferior races. You don't want to revert to the state where you have to kill all the Red Indians lest they ask for their land back or lynch the blacks because they don't know their place in society and consider themselves entitled to equal rights as if.


  9. Anonymous Coward, 29 Jun 2019 @ 4:00am

    Re:

    When this program exists, I personally want to call it Dredd.


  10. Anonymous Cowherd, 29 Jun 2019 @ 4:09am

    Nobody would consider an unknown person sitting in a black box putting out "scores" based on unknown criteria a credible source of sentencing recommendations. Strange that replacing the person with an equally unknown machine makes all the difference.


  11. Anonymous Coward, 29 Jun 2019 @ 4:38am

    Re: Re: Not really Artificial Intelligence. Nor "carceral"

    Would you have the prior highly NON-uniform "system" of judges deciding, with personal spleen or prejudice, irritation from sitting on hemorrhoids all day, and so on?

    Than a black-box program? Absolutely. Judges obviously have their own flaws, some more than others. But replacing them with a secret formula is a step down, not a step up.


  12. Anonymous Coward, 29 Jun 2019 @ 6:49am

    Re: Prisoners, modern slaves

    It seems that is their plan for competing with China.


  13. Anonymous Coward, 29 Jun 2019 @ 6:51am

    I suspect that if one were to investigate business ties they would find there is a connection between this software and the private prison industry.


  14. Anonymous Coward, 29 Jun 2019 @ 6:55am

    Re:

    Judges are not kept in black boxes; they do not create scores, and many times judgments are accompanied by a detailed rationale for the sentence.

    Strange that your silly analogy sucks.


  15. TFG, 29 Jun 2019 @ 7:07am

    Re: Re:

    I believe you have misinterpreted the analogy. The analogy does not compare Judges to unknown people in a black box.

    Rather, the analogy takes the concept of a black box that spits out numbers, which is what we have now in a software format, and removes the software to replace it with a human element.

    Society would not accept some unknown person, above all forms of redress, handing out sentences, but for some reason software is fine. That's the point of the analogy.


  16. NoahVail (profile), 29 Jun 2019 @ 7:13am

    Re: Re: Not really Artificial Intelligence. Nor "carceral"

    Nor "carceral" a word.

    Is your false assertion here bait?
    Asking because the alternative is you making up crap out of the blue.


  17. Rocky, 29 Jun 2019 @ 7:38am

    Re: Not really Artificial Intelligence. Nor "carceral" a word.

    What? Isn't "carceral" a word?

    Merriam-Webster disagrees with you:

    Definition of carceral
    : of, relating to, or suggesting a jail or prison

    As usual, your belief in your own knowledge is misplaced.


  18. Gary (profile), 29 Jun 2019 @ 7:51am

    Re: Not really a word.

    making up new words out of the blue as go, quite oddly mimicking the flaws of this "system".

    Wow. This must be a spoof - not even Blue Balls would say something so stunningly ignorant and ironic. A1 pot black kettle shit there.


  19. Anonymous Coward, 29 Jun 2019 @ 8:31am

    Re: Re: Re:

    It is still a bad analogy


  20. Anonymous Coward, 29 Jun 2019 @ 11:22am

    Re: Re: Not really Artificial Intelligence. Nor "carceral"

    "you must have some data that suggests intrinsic unfairness AND have some workable alternative before can really criticize"

    Bullshit


  21. Anonymous Coward, 29 Jun 2019 @ 11:23am

    Re: Re: Not really Artificial Intelligence. Nor "carceral"

    When laws are stupid and make no sense the populace tends to ignore them. Go figure.


  22. David, 29 Jun 2019 @ 12:22pm

    Re: Re: Re: Re:

    Well, the whole point of AI-based sentencing is to make bad analogies in order to punish people for the statistics of their peers rather than their individual actions. So making bad analogies about bad analogies seems par for the course.


  23. Mark, 29 Jun 2019 @ 6:50pm

    Study cited in the article has been debunked

    The study you are citing (Angwin et al. from ProPublica) has been debunked. No bias could be shown to exist in the Northpointe software.

    Source 1: "Algorithms in the Justice System: Some Statistical Issues", Royal Statistical Society (November 08, 2018) which calls the ProPublica study "ill-founded".
    Source 2: Flores, Bechtel, Lowenkamp; Federal Probation Journal, September 2016, "False Positives, False Negatives, and False Analyses: A Rejoinder to “Machine Bias: There’s Software Used Across the Country to Predict Future Criminals. And it’s Biased Against Blacks.”", URL http://www.uscourts.gov/statistics-reports/publications/federal-probation-journal/federal-probation-journal-september-2016
    In fact the ProPublica analysis was so poorly done (bad sampling, bad analysis, basic statistical mistakes) that the authors wrote: "It is noteworthy that the ProPublica code of ethics advises investigative journalists that "when in doubt, ask" numerous times. We feel that Larson et al.'s (2016) omissions and mistakes could have been avoided had they just asked. Perhaps they might have even asked...a criminologist? We certainly respect the mission of ProPublica, which is to "practice and promote investigative journalism in the public interest." However, we also feel that the journalists at ProPublica strayed from their own code of ethics in that they did not present the facts accurately, their presentation of the existing literature was incomplete, and they failed to "ask." While we aren’t inferring that they had an agenda in writing their story, we believe that they are better equipped to report the research news, rather than attempt to make the research news."

    That said, people have tried fixing the criminal justice system for years. Sentencing guidelines, sensitivity training, whatever... When humans made the decisions, the outcomes apparently weren't much different, or any less biased. The algorithmic approaches have the advantage that they are consistent and (in theory) could be verified (how else do you get a large enough sample for each judge to test that they are not biased?). We can discuss the particular implementation (maybe COMPAS isn't the best to use), but I do think using algorithms that optimize for a pre-determined outcome (say, no new offense once released) can be a better way of handling sentencing.


  24. discordian_eris (profile), 30 Jun 2019 @ 2:42am

    Same Old Problem, from the usual suspects

    Garbage In, Garbage Out


  25. Anonymous Coward, 30 Jun 2019 @ 6:16am

    Re: Study cited in the article has been debunked

    The attempt to replace judges with software is an ill-fated venture.
    What's next... Milt screaming "Software is people, my friend"?

    Does this apply to serious cases? Probably not. It will probably be used in conjunction with some AI public defender software designed to make plea bargains run more efficiently thus providing for a full prison system enabling bonuses and dividends for a few careless individuals.

    This will not bring any consistency to the multi-tiered justice system where wealth equals immunity.


  26. Beta (profile), 30 Jun 2019 @ 9:26am

    [i]"The US locks up an alarming number of people every year and an alarming percentage of them are black. Feed this data into a system that wants to see if it's locking up enough black people and the data will tell judges to keep hitting black people with longer sentences."[/i]

    Umm... how do you know that? How do you know how "the system" will respond to such data? If you haven't seen the code, you don't know which way it'll jump.


  27. Anonymous Coward, 30 Jun 2019 @ 9:36am

    Re:

    I appreciate your attempt at unbiased evaluation; however, history provides some guidance in the effort to prepare for future events.

    What makes you think the powers presently in charge would allow unbiased sentencing?

    If the AI is one that is supposed to learn ... what will it learn from all the past sentencing that it is privy to?


  28. Anonymous Coward, 1 Jul 2019 @ 8:01am

    Re: Study cited in the article has been debunked

    Even if the black box isn't spitting out biased results at this point in time -- its inability to explain itself and the lack of accountability for any problems with it that might crop up in the future are both problems in their own right. If we are going to have a Sentence-O-Matic roaming around, it better darn well be something that is at least as transparent and accountable as the rest of our justice system...


  29. Thad (profile), 1 Jul 2019 @ 8:23am

    Re: Study cited in the article has been debunked

    Well if an unbiased source like *checks URL* uscourts.gov says the study of the criminal justice system is wrong, that's good enough for me.


  30. Anonymous Coward, 2 Jul 2019 @ 10:23am

    Who wrote the software? What do you think it does?

    So if you were a Private Corporation that was able to charge whatever you wanted for services ($10.00/Minute for phone calls, $5.00 for a candy bar, etc), how would you design software for use in 'filling' your for profit prison?

    1. If bed available, release not advised at this time
    2. If no bed available, release not advised at this time, request extra funding for increase in available beds.
    3. Go To 1
    4. If made it to here, advise release at this time.

    PROFIT ALL THE WAY TO THE BANK BABY... I'z an xclnt coder.
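
    The same logic, rendered as runnable Python (a sketch of the joke above, not anyone's actual product -- the function and message are invented):

        # The commenter's satirical algorithm, rendered as runnable Python.
        # "Go To 1" makes step 4 unreachable, which is the joke: as long as
        # there's a bed to bill for, release is never advised.
        def release_advised(bed_available: bool) -> bool:
            while True:                                   # 3. Go To 1
                if bed_available:
                    return False                          # 1. release not advised
                print("Requesting extra funding for more beds...")  # 2.
            return True                                   # 4. unreachable by design

        print(release_advised(True))  # False -- the only value it can ever return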


