State Supreme Court Says Secret Software Used In Sentencing Determinations Not A Violation Of Due Process Rights

from the not-as-long-as-it's-used-perfectly-within-an-impossible-set-of-confines dept

An algorithm is deciding certain criminal defendants should spend more time in prison. And that determination can't be fully challenged because the code belongs to a private company which provides the software to the government.

Eric Loomis was determined to be a "high risk" defendant, based on something called a "COMPAS score." COMPAS -- Correctional Offender Management Profiling for Alternative Sanctions -- cranks out Presentence Investigation Reports for use in the courtroom, using a number of factors to generate a score that lets judges know how likely the defendant is to re-offend.
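
Northpointe's actual scoring model is a trade secret -- that secrecy is the heart of Loomis's complaint -- so any concrete example here is necessarily hypothetical. Purely as a sketch of what a factor-based risk score looks like in general, with the factor names, weights and scoring function all invented for illustration (none of it taken from COMPAS):

```python
from math import exp

# Hypothetical illustration only -- the real COMPAS model is proprietary.
# General shape of a factor-based risk score: weighted inputs, squashed to a
# probability, then bucketed into a 1-10 decile like the scores judges see.

FACTOR_WEIGHTS = {                      # invented factors and weights
    "age_at_first_arrest": -0.03,
    "prior_arrests": 0.25,
    "employment_unstable": 0.40,
    "neighborhood_crime_rate": 0.30,
}

def risk_decile(answers: dict) -> int:
    """Map weighted questionnaire answers to a 1-10 risk decile (10 = highest)."""
    raw = sum(weight * answers.get(factor, 0)
              for factor, weight in FACTOR_WEIGHTS.items())
    prob = 1 / (1 + exp(-raw))           # logistic squash to (0, 1)
    return min(10, int(prob * 10) + 1)   # bucket into deciles

# Example: first arrested at 17, three priors, unstable employment.
print(risk_decile({"age_at_first_arrest": 17, "prior_arrests": 3,
                   "employment_unstable": 1, "neighborhood_crime_rate": 2.5}))
```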

The problems with this system are numerous. For one, the code is proprietary, so defendants aren't allowed to examine the factors that lead to this determination, unlike the sentencing guidelines created by the government, which are open to public examination.

Another problem is that the algorithm engages in demographic profiling -- generally considered to be a bad thing when it comes to determining criminal behavior.

Back in May, ProPublica published an investigation into the risk-assessment software that found the algorithms were racially biased. ProPublica looked at the scores given to white people and black people, then checked whether the predictions were correct (by looking at whether those people actually went on to commit crimes). It found that in Broward County, Florida, which was using software from a company called Northpointe, black people were more likely to be mislabeled with high scores, while white people were more likely to be mislabeled with low scores.
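
ProPublica's comparison boils down to computing, for each group separately, how often people who did not go on to reoffend were labeled high risk (false positives) and how often people who did reoffend were labeled low risk (false negatives). Here is a minimal sketch of that calculation; the records are invented stand-ins, and only the direction of the output is meant to mirror the Broward County finding:

```python
# Sketch of a ProPublica-style error-rate comparison by group.
# The records below are made up; only the method reflects the reporting.

records = [
    # (group, labeled_high_risk, reoffended)
    ("black", True,  False),
    ("black", True,  True),
    ("black", False, False),
    ("white", False, True),
    ("white", False, False),
    ("white", True,  True),
]

def error_rates(data, group):
    rows = [r for r in data if r[0] == group]
    did_not_reoffend = [r for r in rows if not r[2]]
    did_reoffend = [r for r in rows if r[2]]
    fpr = sum(r[1] for r in did_not_reoffend) / len(did_not_reoffend)
    fnr = sum(not r[1] for r in did_reoffend) / len(did_reoffend)
    return fpr, fnr

for group in ("black", "white"):
    fpr, fnr = error_rates(records, group)
    print(f"{group}: wrongly labeled high risk {fpr:.0%}, "
          f"wrongly labeled low risk {fnr:.0%}")
```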

"Fits the profile" is the new "fits the description" -- something that seems predisposed to putting blacks behind bars more frequently and for longer periods of time. Eric Loomis tried to challenge his COMPAS score but got nowhere with it, as the math behind it is locked up by Northpointe, which claims giving a defendant access to its trade secrets would pose a serious risk to its profitability.

Loomis argued that not giving him access posed a serious risk to his freedom. Allowing Northpointe to keep its algorithm secret was a violation of his due process rights, as it presented an unchallengeable score that could be used to keep him locked up longer than the normal range for the criminal activity he was convicted of.

His case went up the ladder to the Wisconsin Supreme Court, which has found [PDF] that defendants being unable to fully challenge a sentencing determination isn't a Constitutional problem.

Ultimately, we conclude that if used properly, observing the limitations and cautions set forth herein, a circuit court's consideration of a COMPAS risk assessment at sentencing does not violate a defendant's right to due process.

We determine that because the circuit court explained that its consideration of the COMPAS risk scores was supported by other independent factors, its use was not determinative in deciding whether Loomis could be supervised safely and effectively in the community. Therefore, the circuit court did not erroneously exercise its discretion. We further conclude that the circuit court's consideration of the read-in charges was not an erroneous exercise of discretion because it employed recognized legal standards.

Accordingly, we affirm the order of the circuit court denying Loomis's motion for post-conviction relief requesting a resentencing hearing.

The downside of this decision is that Northpointe cannot be forced to hand over its algorithm for examination by criminal defendants. The upside is that the court has issues with using COMPAS scores to determine sentence lengths.

[T]he opinion comes with some interesting caveats about things judges need to keep in mind when using risk scores in sentencing decisions: The two most important factors they’re asked to keep in mind are that the software has been found to be racially biased and that the software needs to be constantly monitored and updated with new information. (If you’re relying on data from five or ten years ago, it’s not going to be accurate.)

The court also notes in passing that the software was never intended to be used to determine sentence lengths. It was supposed to be used by the Department of Corrections to assess risks posed by parolees or those requesting parole. But it does not go so far as to forbid the use of COMPAS scores in sentencing decisions. Nor does it suggest that opening up the algorithm for examination might bring much-needed transparency to the sentencing process. Instead, the Supreme Court says judges must walk a very fine line when utilizing COMPAS scores.

The queasiness that judges feel about algorithmic risk assessment is reflected in the concurring opinion filed by Justice Patience Drake Roggensack. “Reliance would violate due process protections,” she writes. “Accordingly, I write to clarify our holding in the majority opinion: consideration of COMPAS is permissible; reliance on COMPAS for the sentence imposed is not permissible.”

Unless a whole lot of judicial explanation accompanies every sentencing decision utilizing a COMPAS score, it's going to be almost impossible for defendants to tell whether a judge has just "considered" Northpointe's presentence investigation reports… or "relied" on them. Any sentence not hitting the upper end of the software's recommendations could be viewed as mere "consideration," even if "reliance" might be a more accurate term.

Without being allowed to closely examine COMPAS scores, defendants still aren't being given a chance to challenge any erroneous information that might be included in these reports. The court's reluctance to fully endorse the use of the software in sentencing decisions is a step forward, but it still allows judges to hand down sentences based on secret formulas that have already shown a predilection for recommending longer sentences to certain demographic groups.


Filed Under: algorithms, compas, due process, sentencing, wisconsin


Reader Comments



  • Mason Wheeler (profile), 2 Aug 2016 @ 8:08am

    The two most important factors they’re asked to keep in mind is ... and that the software needs to be constantly monitored and updated with new information. (If you’re relying on data from five or ten years ago, it’s not going to be accurate.)

    I have to wonder about that particular point. Are they really implying that they believe human nature is going to change that much in 5-10 years?


    • Anonymous Coward, 2 Aug 2016 @ 9:43am

      Re:

      They have implemented a self-fulfilling prophecy: start with a bias and the data will reinforce that bias as it builds up.


    • Anonymous Coward, 2 Aug 2016 @ 9:49am

      Re:

      The data may involve things that do change in five or ten years. Some of this type of software accounts for what kind of neighborhood a defendant lives in. Neighborhood demographics and crime rates can change. Someone who lives in a neighborhood that gentrified in the last five years but was previously riddled with crime might get a harsher sentence because they are believed to live in a place where crime is likelier to occur.


      • Anonymous Coward, 2 Aug 2016 @ 11:14am

        Re: Re:

        You seem to be very comfortable with it; is there any reason for this?

        I would think the punishment for a crime should not be a variable dependent upon social standing, or any other unrelated data. But we have seen relaxed sentencing simply because the perp is from a rich family, and then said perp skipped home curfew for a vacation in Mexico.

        Apparently family income is not a good indicator of recidivism, and yet it is used to predict whether the perp will be a problem in the future. Go figure.


        • Anonymous Coward, 2 Aug 2016 @ 1:56pm

          Re: Re: Re:

          Did you reply to the wrong post? Nothing I said implied I was comfortable with it. I just tried to offer an explanation why it would be necessary to have up-to-date data if you're going to use this software for any purpose.

          That speaks nothing to the fact that I don't think this software should be used at all by judges. It removes the reason for having human judges in the first place. That said, human judges have been wrong (see the insufficient sentencing of Brock Turner or Ethan Couch) and need better oversight and accountability apparently.


    • James Burkhardt (profile), 2 Aug 2016 @ 12:07pm

      Re:

      The software does not measure inherent nature. It collects data about neighborhood, race, previous accusations and convictions, and statistically models you. 5-10 years of data can significantly change that model.


      • Anonymous Coward, 3 Aug 2016 @ 6:24am

        Re: Re:

        What happens when you've just moved into a bad neighborhood from a faraway place? Does other people's crime rate determine an individual's?


        • Anonymous Coward, 3 Aug 2016 @ 6:52pm

          Re: Re: Re:

          What happens when you've just moved into a bad neighborhood from a faraway place? Does other people's crime rate determine an individual's?

          Of course!


    • Gemma, 2 Aug 2016 @ 12:16pm

      Re:

      Are they really implying that they believe human nature is going to change that much in 5-10 years?

      If, for example, one of the factors determining your "nature" is your neighborhood's criminal history then, yes, it most certainly can.


  • JBDragon (profile), 2 Aug 2016 @ 9:54am

    I see this as so wrong, allowing a computer program to dictate a person's life and freedom, based on??? There should be real people making these judgments, not a software program, no matter what the case.


    • Anonymous Coward, 2 Aug 2016 @ 10:18am

      Re:

      The use of the software isn't the real problem. The real problem is that we can't look at what it's doing. We can't know what factors the results were based on.

      I'd be perfectly happy leaving a lot of decisions in government up to a computer algorithm so long as that algorithm were fairly designed and we could examine it and see as much. It would be much preferable to semi-corrupt persons with their conflicts of interests and biases.

      But this situation is disgusting. It's a clear violation of the right to examine the evidence and testimony against you.


      • Anonymous Coward, 2 Aug 2016 @ 10:29am

        Re: Re:

        It's a clear violation of the right to examine the evidence and testimony against you.

        While I agree with the general point of your post that this is a bad ruling and we should be able to inspect these algorithms, this is actually in a legal gray area. There is plenty of testimony during the sentencing phase that a defendant is not allowed to directly respond to, so it's not without precedent (using the term colloquially here, not legally).


        • Anonymous Coward, 2 Aug 2016 @ 1:54pm

          Re: Re: Re:

          Democracy simply does not survive "gray areas" or "secrets" of any kind in law or its enforcement. To pretend that anything about law enforcement or the imprisonment of human beings should even be REMOTELY allowed to have any kind of secrets smacks of a corrupt and/or ignorant mind!


          • Richard (profile), 3 Aug 2016 @ 6:12am

            Re: Re: Re: Re:

            Exactly - there used to be a phrase for that: "Not only must Justice be done; it must also be seen to be done."

            https://en.wikipedia.org/wiki/R_v_Sussex_Justices,_ex_p_McCarthy

            Use of a secret computer program is in violation of that principle - and the linked case does actually look surprisingly similar to the present one.

            A quote from the decision on that case reads "Nothing is to be done which creates even a suspicion that there has been an improper interference with the course of justice. "


          • Anonymous Coward, 3 Aug 2016 @ 9:16am

            Re: Re: Re: Re:

            I wonder if the kind of car the criminal drove has any bearing on the likelihood of repeat offenses!


            • Anonymous Coward, 3 Aug 2016 @ 9:26am

              Re: Re: Re: Re: Re:

              ..or if that person's shoe size is disproportionate to their actual height, maybe they are 'mongroloid' and severely lack common sense or sensible constraint in the eyes of this program's algorithmic logic.


    • David, 2 Aug 2016 @ 10:28am

      Re:

      Sorry, but computer or not, the whole "likely to become a repeat offender" thing makes sense with regard to rehabilitation programs, namely helping the defendant to regain footing.

      It has nothing whatsoever to do with sentencing. You cannot punish people for things they haven't done yet. Not if you pretend that this has anything to do with justice.

      Secret sentencing criteria based on likelihood to reoffend will punish the same deeds harder for people who have a background making it harder to live honestly. So people who more deliberately commit crimes get let off the hook cheaper.


      • Stupidfied, 3 Aug 2016 @ 9:33am

        Re: Re:

        You cannot punish people for things they haven't done yet. Not if you pretend that this has anything to do with justice.

        It does seem to come straight out of a George Orwell novel and is, indeed, draconian at best.


    • Anonymous Coward, 2 Aug 2016 @ 12:41pm

      Re:

      ...perhaps we can call these people "judges" that utilize human reasoning skills.


  • Anonymous Coward, 2 Aug 2016 @ 10:15am

    What the hell is the USA coming to? What's the next thing that is going to be detrimentally determined by a computer, penalising human beings? Carry on like this and it won't be long before the machines take over completely!

    Terminator, here we come!!


    • Wendy Cockcroft, 3 Aug 2016 @ 5:43am

      Re:

      Here in the UK we have a TV show called Little Britain. One of the characters, customer service clerk Carol Beer, has a catchphrase, "Computer says no," which is her stock response to queries from the public. This is the mentality we're facing. The possibility of Terminator becoming a documentary is not at issue here, it's "When are we going to stop hiding behind computers when we're being jerks to each other?"


    • John85851 (profile), 3 Aug 2016 @ 10:11am

      Re:

      What are we coming to? Can anyone say "Minority Report":

      You're under arrest because the software says you're a 21-year-old black man from a low-income neighborhood.
      But I haven't done anything.
      Not yet, but the software says you will, so we're locking you up before you get the chance to commit a crime.


  • Uriel-238 (profile), 2 Aug 2016 @ 10:25am

    Due process is supposed to be transparent.

    A secret algorithm that determines the guilt or innocence of a defendant makes this a secret court.

    Once again Florida doesn't just shamble but bounds its way down the road to tyranny.

    Granted, the US is a shadow of the free nation it once was, but baby steps got us here. Baby steps.


  • That One Guy (profile), 2 Aug 2016 @ 10:27am

    Can't have it both ways

    If the software isn't being used to influence the guidelines, and therefore can't be challenged in court, then prohibit it from being used at all. Any use is going to influence the ruling, otherwise why even use it?

    Conversely if the software is going to be used it absolutely should be open for examination and challenge by the defense, 'profits' be damned since we're talking about people's freedom here.


  • Anonymous Coward, 2 Aug 2016 @ 10:28am

    Even if this software's algorithms are revealed (and they likely will be; "secret evidence" etc.) the issue still raises some big unanswered questions.

    Predicting whether a person is likely to (re)commit violent crimes runs into a serious problem that pits political correctness vs. cold hard reality. Police statistics show a huge (& highly predictable) correlation between a person's race, sex, and age vs. his tendency to commit violent crimes.

    If the goal is to protect individual liberty, then every person must be treated absolutely equally.

    But if the goal is to protect society, then some rather Nazi-esque policies would be needed, such as "preemptively" jailing all inner-city black males between the ages of 15 and 30. There's no question that crime would drop enormously, since that tiny demographic is responsible for a statistically huge amount of crime. The question is whether the concepts of equal-rights and due-process that form the cornerstone of modern Western society would allow such a thing, no matter how scientifically justified.


    • David, 2 Aug 2016 @ 10:36am

      Re:

      The logic behind preemptive jailing would not make it possible to release the jailed persons ever again, so you'd better execute them once they cannot contribute to society any more in the form of labor.

      See how much sense the Nazis made? Can we stop reviving their logic again and again and leave it in the graveyard of history?

      Our justice system is supposed to punish people for what they did, not for who they are.


    • Uriel-238 (profile), 2 Aug 2016 @ 10:47am

      Police statistics...

      Are those police statistics based on the current high rate of convenient arrests and almost certain convictions?

      Law enforcement has plenty of tools at their disposal by which to circumvent warrants to conduct unreasonable searches. Dog sniffs, for example, can give a false positive up to 94% of the time. You just need the right dog.

      We have more people in jail (per capita) than any other nation. We also have a solid likelihood that the majority of those people are falsely convicted. Plenty of them have major-crime sentences for petty crimes (e.g. possession).

      And then there's the matter that police file reports on what they want to file, and will omit filing on what they don't care about.

      So I wouldn't trust police statistics to be an indicator of how civilians behave, so much as how police behave.


      • Ninja (profile), 2 Aug 2016 @ 11:24am

        Re: Police statistics...

        That. And even if the statistics are right they don't account for the time we've been systematically screwing those who are non-white since like forever (in the West at least). If you grow up being screwed and the only viable option to get some decent life and respect is crime it's no wonder you'll fall for it. And sadly even the black cops are racist sometimes.


    • orbitalinsertion (profile), 2 Aug 2016 @ 11:44am

      Re:

      Those police statistics show who the police harass, investigate, and charge. Not who is actually more likely to do anything.


  • Anonymous Coward, 2 Aug 2016 @ 10:34am

    This is disgusting and this system should be turned inwards to see how likely judges and politicians are to commit crimes.


  • Ninja (profile), 2 Aug 2016 @ 10:37am

    So the company can basically set up a random number generator behind the user interface and show the metaphoric middle finger to all the people being screwed by lady bad luck. Or a tyrannical Government could just set a number that fits their needs. Dissident? Set his score so he gets a life sentence.

    China approves. I wonder how long till they copy the process and actually manipulate the results.


  • Quiet Lurcker, 2 Aug 2016 @ 10:38am

    Was Daubert decided in vain?

    In the realm of computers and information technology, a program like this qualifies as an expert system - emphasis on expert, if you please - if only loosely. The software, and the people who purvey it, claim that the software knows better than any human, which is the same claim made of nearly every bit of expert testimony, ever.

    Since that is the case, shouldn't the defense and courts be able to make an examination, either under Daubert, under Rule 702, or as a matter of common law?

    As it stands, I'm with Anonymous Coward. Use of the software without outside review is a clear and direct violation of the Sixth Amendment.


  • JustMe (profile), 2 Aug 2016 @ 11:07am

    Sheesh - could the Court have punted any harder?

    This is why jurors need to be aware of their rights. Jury nullification is legal and proper, especially when one group of defendants is treated differently than other groups for the same crimes.

    http://law2.umkc.edu/faculty/projects/ftrials/zenger/nullification.html

    https://www.aclu.org/feature/fair-sentencing-act


    • James Burkhardt (profile), 2 Aug 2016 @ 12:16pm

      Re: Sheesh - could the Court have punted any harder?

      Juries have the POWER to nullify. That comes as a natural extension of the design of our court system. What is debated is whether they have the RIGHT to nullify. In several jurisdictions that will get you booted off the jury.

      In either case, Jury Nullification has no bearing on judges using software to decide sentencing lengths. You cannot nullify sentencing; you'd have to nullify the conviction that happens before the sentencing, and we are talking about a problem that occurs even for those who should be punished.


      • Uriel-238 (profile), 2 Aug 2016 @ 12:38pm

        Jury nullification as a protest to the system.

        The unreasonable use of secret judicial software would be good cause to nullify on principle in Florida, but I'm pretty sure there are lots of valid justifications to wrench the system in Florida.

        Because Florida.


  • Anonymous Coward, 2 Aug 2016 @ 11:18am

    Supposedly we have legislating from the bench... so now we have Sentencing by the Corporations. They do not even have to bribe legislators. wtf


  • crade (profile), 2 Aug 2016 @ 11:36am

    So they can use the "sentence recommendation software" when deciding the sentence as long as they don't let it bias the sentence. How that is supposed to be possible, as well as what purpose this recommendation has other than biasing the sentence, is unaddressed.


  • orbitalinsertion (profile), 2 Aug 2016 @ 11:49am

    Never mind examining the algorithm, or how up to date the information is: what information is used, how is it weighted, and how comprehensive is it?

    Something like that could be hugely useful if done correctly. Not so much for sentencing, except perhaps for mitigation. But "correctly" is something we fail to do quite often. (And then hide it, deny it, and make money from it. And further use it inappropriately on top of that.)


  • Anonymous Coward, 2 Aug 2016 @ 12:22pm

    What we really need

    What we really need is an application to evaluate judicial competence and bias. We could then clean up our court system.


  • Anonymous Coward, 2 Aug 2016 @ 3:11pm

    America's version of capitalism is truly shocking. It permeates everything. Any common sense restrictions are hysterically labeled "socialism".

    Go ask one of your KFC professors of health policy. On second thought, don't.


    • Wendy Cockcroft, 3 Aug 2016 @ 5:50am

      Re:

      If you've ever wondered why...

      http://reclaimdemocracy.org/powell_memo_lewis

      Powell just wanted to entrench capitalism as the dominant ideology but right-wing activists took the memo and ran with it in directions he never envisaged. Result: activist judges, etc., pushing a fascist line in the courts and neoliberalism presented as a middle-ground position. I never realised how all-pervasive the infiltration was until recently. It's all down to Powell, people.


  • Anonymous Coward, 2 Aug 2016 @ 3:12pm

    It appears to me to be more a violation of the 8th amendment than the 4th. That is, the courts have ruled that the 8th amendment prohibits punishments that are disproportionate to the crime. If you are incapable of explaining why a defendant received a certain punishment, then it is impossible to show that the punishment is in proportion to the crime.


    • Uriel-238 (profile), 2 Aug 2016 @ 3:54pm

      Disproportionate punishment

      As much as I believe disproportionate punishment should be unconstitutional, I can see how the Eighth can be interpreted to at least not say that.

      Mandatory minimums have done nothing but make for disproportionate punishments.

      Of course the rule against cruel punishment should outlaw all the state and federal penal systems in the US. I'm not sure why it doesn't. Not enough abuse?


  • Rekrul, 2 Aug 2016 @ 3:47pm

    Pre-crime.


  • Anonymous Coward, 2 Aug 2016 @ 4:23pm

    Even if you can inspect the algorithm, the one you're looking at isn't necessarily the one used.

    Given variations in code due to software maintenance, or even virus or malware infection, there are likely a number of problems related to code auditing that apply.

    Not to mention the fat finger in the source data that seems to always show up after the warden gets a call from the governor, right?

    So it isn't just validation of the algorithm, but validation of the input data, which I can almost guarantee will be badly tainted.

    There will be socioeconomic factors reflected in the data that are not within the scope of the defendant's control. That, by itself, means the court is distinguishing formally based on factors that could be regarded as social class.

    But I think that isn't the point.

    My guess is that COMPAS is mostly about externalizing liability. What they are doing is slopping a bunch of tech around, so that if they get called into court they can say "buh, buh, buh da COMPUTER said so!" Which would be why it is important to ensure that nobody knows what the computer is doing.

    It is bad science. Reminiscent of eugenics in the early 1930s. They would do themselves a big favor by taking a hard look at this before it snowballs.


    • Uriel-238 (profile), 2 Aug 2016 @ 4:50pm

      The SCIENCE of eugenics worked.

      Eugenics was just selective breeding. The problem is choosing which traits to favor and which traits to disfavor, which the Germans did on racial / mythical lines, and proved it's too easy for the elites to bias.

      But yes, when they create a black box and give that box authority, it gives the appearance of displacing liability.

      The problem is, of course, that people condemned by the black box are not going to accept that it is to blame, so much as the institution --and specific officer-- that pushed it to the black box.

      After the black box has incriminated some thousand or so people, I wonder if Florida judges will develop an attrition problem.


  • Anonymous Coward, 3 Aug 2016 @ 3:50am

    Court Packing

    One more result of Gov. Scott Walker's court packing efforts.


  • Anonymous Coward, 3 Aug 2016 @ 5:16am

    Secret courts may be legal in Wisconsin, but this fails US constitutional muster.


  • Willie Prevail, 3 Aug 2016 @ 9:03am

    The real culprit stand

    We are not all equal in the eyes of the law. This puts yet another black eye on the face of justice, and on the trust one could place in those working in the justice system.

    Somehow, I still blame the bad guys who knowingly commit multiple crimes and attempt to finagle the system, making it bad for everyone else when it might be their turn to seek justice.


  • Padpaw (profile), 3 Aug 2016 @ 12:46pm

    Secret courts with secret evidence, then? Why even bother having a court system? Just have the police execute people on sight. They already do that, but now they have even more incentive to do so.
