UK Police Use Zipcode Profiles, Garden Size And First Names For AI-Based Custody Decision System

from the black-box-says-you're-a-bad-person dept

As you have doubtless noticed, Cambridge Analytica has been much in the headlines of late. There is still plenty of disagreement about the extent to which the company's profiling tools provide the kind of fine-grained categorization of people that it claims, and whether it played a significant -- or indeed any -- role in deciding key elections, both in the US and elsewhere. What is not disputed is that such profiling is widely used throughout the online world, mostly to sell ads, and that it is likely to become more accurate as further data is gathered and analytical techniques are honed. The continuing flow of reports about Cambridge Analytica and related companies has therefore at least served the purpose of alerting people to the important issues raised by this approach. Against that background, news that UK police in the north of England are applying similar techniques is troubling:

Durham Police has paid global data broker Experian for UK postcode [zipcode] stereotypes built on 850 million pieces of information to feed into an artificial intelligence (AI) tool used in custody decisions, a Big Brother Watch investigation has revealed.

Durham Police is feeding Experian's 'Mosaic' data, which profiles all 50 million adults in the UK to classify UK postcodes, households and even individuals into stereotypes, into its AI 'Harm Assessment Risk Tool' (HART). The 66 'Mosaic' categories include 'Disconnected Youth', 'Asian Heritage' and 'Dependent Greys'.

To help the police decide whether someone should be charged with an offense, the HART system estimates how likely that person is to re-offend. "High-risk" offenders are charged. Those with a "moderate" risk of re-offending are offered the option of joining a rehabilitation program; if they complete it successfully, they do not receive a criminal conviction. To build the specialized AI system, the local UK police force has been working with a team of researchers at the University of Cambridge:

Called the Harm Assessment Risk Tool (HART), the AI-based technology uses 104,000 histories of people previously arrested and processed in Durham custody suites over the course of five years, with a two-year follow-up for each custody decision. Using a method called "random forests", the model looks at vast numbers of combinations of 'predictor values', the majority of which focus on the suspect's offending history, as well as age, gender and geographical area.
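
For readers who want to see the shape of such a system, below is a minimal sketch of a random-forest risk model with a three-way disposition rule like the one described above. To be clear, this is not the actual HART model: the feature set, the synthetic training data and the probability cut-offs are all invented here for illustration.

    # A toy random-forest "risk" model, loosely following the description
    # above. NOT the real HART system: the features, synthetic data and
    # thresholds are all invented for illustration.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(0)

    # Synthetic stand-ins for custody histories:
    # columns = [prior_offenses, age, gender_code, area_code]
    X = rng.integers(0, 50, size=(1000, 4))
    y = rng.integers(0, 2, size=1000)  # 1 = re-offended within two years

    # A random forest trains many decision trees, each on a random subset
    # of rows and features, and averages their votes into a probability.
    model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

    def disposition(features):
        """Map the predicted re-offending probability to the three-way
        outcome the article describes (the cut-offs here are arbitrary)."""
        p = model.predict_proba([features])[0, 1]
        if p >= 0.7:
            return "high risk: charge"
        if p >= 0.3:
            return "moderate risk: offer rehabilitation program"
        return "low risk"

    print(disposition([3, 24, 1, 7]))

The sketch also shows where the opacity comes from: each score is an average over a hundred trees and thousands of learned split points, so even the model's builders cannot point to a single human-readable rule behind any individual recommendation.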

The basic HART system has been in use since 2016. But Big Brother Watch has discovered that HART has been extended in a significant way through the use of the profiling information acquired from Experian. This Dublin-based company -- not to be confused with Equifax, which works in the same field -- has amassed personal information on hundreds of millions of people around the world. Where things become more problematic is how the profiles that Experian has passed to the Durham police force for its HART system are compiled. As well as basic zipcodes, Experian gathers, aggregates and analyzes a wide range of sensitive "predictor values", such as the following (a hypothetical sketch of how attributes like these might be encoded for a model appears after the list):

Family composition, including children,
Family/personal names linked to ethnicity,
Online data, including data scraped from the pregnancy advice website 'Emma's Diary', and Rightmove [UK real estate site],
Occupation,
Child [support] benefits, tax credits, and income support,
Health data,
[Children's exam] results,
Ratio of gardens to buildings,
Census data,
Gas and electricity consumption.
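
To see what feeding attributes like these into a model actually involves, here is a purely hypothetical sketch of how such profile data might be flattened into the numeric features a classifier consumes. The category names and encodings below are invented for illustration; they are not Experian's actual 'Mosaic' scheme.

    # Purely illustrative: turning stereotype-style profile attributes into
    # numeric model features. Categories and encodings are invented, not
    # Experian's actual 'Mosaic' scheme.
    profile = {
        "name_category": "Families with Needs",  # inferred from a first name
        "garden_to_building_ratio": 0.4,
        "annual_gas_kwh": 12000,
        "receives_income_support": True,
    }

    NAME_CATEGORIES = ["Families with Needs", "Crowded Kaleidoscope",
                       "Low Income Workers"]

    def encode(p):
        # One-hot encode the name-derived category; pass numeric fields
        # through as floats.
        onehot = [1.0 if p["name_category"] == c else 0.0
                  for c in NAME_CATEGORIES]
        return onehot + [p["garden_to_building_ratio"],
                         p["annual_gas_kwh"] / 1000.0,
                         float(p["receives_income_support"])]

    print(encode(profile))  # -> [1.0, 0.0, 0.0, 0.4, 12.0, 1.0]

Once flattened this way, a name-derived stereotype or a garden ratio is just another number in the input vector, which the model weighs like any other predictor.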

The use of first names to help assign people to categories is a striking feature of the approach:

Experian's 'Mosaic' links names to stereotypes: for example, people called 'Stacey' are likely to fall under 'Families with Needs' who receive 'a range of [government] benefits'; 'Abdi' and 'Asha' are 'Crowded Kaleidoscope' described as 'multi-cultural' families likely to live in 'cramped' and 'overcrowded flats'; whilst 'Terrence' and 'Denise' are 'Low Income Workers' who have 'few qualifications' and are 'heavy TV viewers'.

Stereotyping people on the basis of where and how they live carries an evident risk of making it harder for them to escape challenging life situations: those with less favorable stereotypes are more likely to be prosecuted than those with more favorable profiles, thus reducing social mobility.

An additional issue is that the black-box nature of the HART system, coupled with the complexity of the 850 million data points it draws on, will inevitably make it very hard for police officers to challenge its outputs. They might disagree with its decisions, but in the face of this leading-edge AI-based approach, it would take a very self-assured and experienced officer to ignore a HART recommendation to prosecute, particularly given the risk that the person might then re-offend. It is far more likely that officers will take the safe option and accept the HART system's recommendations, whatever they privately think. As a result, an essentially inscrutable black box will be making critical decisions about a person's life, based in part on where they live, how big their garden is, and whether they are called "Stacey" or "Terrence".

Follow me @glynmoody on Twitter or identi.ca, and +glynmoody on Google+


Filed Under: artificial intelligence, uk
Companies: experian


Reader Comments


  1. Anonymous Coward, 13 Apr 2018 @ 4:16am

    Here in America we have something similar to determine if you go to jail or get off with rehabilitation or no punishment at all, even for the same offense.

    We call it your skin color.


  2. Roger Strong (profile), 13 Apr 2018 @ 5:04am

    This sort of thing will keep happening until the tables are turned. Eventually someone will use the exact same techniques to come up with a decision system for voting: "Probability that a candidate or public official is corrupt and/or a bigot."

    The same indicators are just as valid: Family/personal names linked to ethnicity, zip code, occupation, tax credits, ratio of gardens to buildings, gas and electricity consumption, etc.

    Politicians and public officials can be rated "high-risk", "moderate", etc., and the results released just before each election.

    The justification is the same as for the police system. The ability to appeal - and the potential Streisand effect of any complaint by an official found to be "high-risk" - would be the same too.

    Any argument against one is an argument against the other.


  3. justok (profile), 13 Apr 2018 @ 5:24am

    Sorting

    Suspects with a good credit score will be more likely to be offered fines.


  4. Wendy Cockcroft, 13 Apr 2018 @ 5:53am

    Re:

    I don't always agree with you, Roger, but when I do...

    Good one!


  5. Anonymous Coward, 13 Apr 2018 @ 5:55am

    Re: Sorting

    "more likely to be offered fines.".

    And that will be just "fine" with the Chancellor of the Exchequer.


  6. Anonymous Coward, 13 Apr 2018 @ 5:57am

    Re:

    You've got it all wrong. Money factors in to prosecution decisions far more than skin color. The decision tree looks like:

    If non-white, shoot them at the scene.
    If white and no money, prosecute.
    If white and money, apologize for disturbing them.


  7. Richard (profile), 13 Apr 2018 @ 6:21am

    Re:

    Politicians and public officials can be rated "high-risk", "moderate", etc., and the results released just before each election.

    I've got a feeling that this might just produce exactly the opposite result to that which you are expecting/hoping for.


  8. Anonymous Coward, 13 Apr 2018 @ 6:51am

    Same type of thing for Child Welfare

    CDC
    https://www.cdc.gov/violenceprevention/childmaltreatment/riskprotectivefactors.html

    Child Abuse and Neglect: Risk and Protective Factors

    Community Risk Factors
    •Community violence
    •Concentrated neighborhood disadvantage (e.g., high poverty and residential instability, high unemployment rates, and high density of alcohol outlets), and poor social connections.

    Community Protective Factors
    •Communities that support parents and take responsibility for preventing abuse

    ------------------

    Other search terms--

    "Structured Decision Making" child welfare complaints

    actuarial statistics child welfare complaints
    ...


  9. Anonymous Coward, 13 Apr 2018 @ 7:11am

    Now, all we need is an AI system that helps people optimize their HART scores.


  10. Anonymous Coward, 13 Apr 2018 @ 7:20am

    So what should I name my kid for minimum jail time?


  11. Anonymous Coward, 13 Apr 2018 @ 7:22am

Now watch housing prices in better neighborhoods go up and people change their names. This is similar to Sesame Credit in China.


  12. Oblate (profile), 13 Apr 2018 @ 7:43am

    Ratio of gardens to buildings,

    Note that in the UK the term 'garden' is used as the term 'lawn' is used in the US. This only makes it seem a little less crazy.


  13. Oblate (profile), 13 Apr 2018 @ 7:44am

    Re:

    Rich.


  14. Roger Strong (profile), 13 Apr 2018 @ 8:15am

    Re: Re:

    No doubt the Breitbarts and WorldNetDailys will publish their own versions. Someone will have a left-biased version. Someone else will have a politically neutral version with more credibility, and that'll upset the most people.

    This would spark a conversation about the bias of those lists. Probably a rather excited and abusive conversation, that demonstrates the flaws with this idea. Flaws that also apply to the police version.


  15. Jeffrey Nonken (profile), 13 Apr 2018 @ 8:32am

    "Where things become more problematic is how the profiles that Experian has passed to the Durham police force for its HART system are complied."

    That's the problem with spall chuckers: they only tell you when a word is spilled corruptly, not weather it's the right word.


  16. Anonymous Coward, 13 Apr 2018 @ 9:06am

    I thought that prior convictions were not supposed to have any relevance in prosecuting a case, but it's ok for sentencing?


  17. glyn moody, 13 Apr 2018 @ 9:31am

    Re:

    Thanks, fixed.


  18. Anonmylous, 13 Apr 2018 @ 10:42am

    I was going to make a comment, but then realized it will someday be used against me in a court of law by an AI.


  19. Richard (profile), 13 Apr 2018 @ 10:45am

    Re: Re: Re:

    No doubt the Breitbarts and WorldNetDailys will publish their own versions. Someone will have a left-biased version. Someone else will have a politically neutral version with more credibility, and that'll upset the most people.

    If only it were that simple....

    Unfortunately, in recent history in my country (UK), scare stories about politicians on the left (eg Tony Benn, Michael Foot, Neil Kinnock etc) have been more effective. Problem is that right wing politicians usually have their own money from generations back that looks "clean" on this type of measure - whereas those on the left are more financially challenged and hence more likely to be tempted into dodgier activities.

    a rather excited and abusive conversation, that demonstrates the flaws with this idea. Flaws that also apply to the police version.

    The basic flaw is that no-one understands why it produces the result it does.

    These AI techniques have a long and inglorious history.


  20. Sharur (profile), 13 Apr 2018 @ 12:02pm

    Re:

    Yes, it is completely acceptable to use prior convictions in sentencing. When prosecuting, you must overcome the presumption of innocence, even if the accused has prior convictions. After someone has been found guilty, their prior convictions can be an added consideration in sentencing for the current conviction.

    It is also acceptable (indeed, technically required of the judge) when determining bail/release conditions.


  21. Anonymous Coward, 13 Apr 2018 @ 5:05pm

    Moronic Black Box

    Really, this sort of thing should be downright criminal to use, as it is literally naked bigotry in how it judges. It is clearly just 'racism in a box' with a neural network - you can tell by what it is judged upon.

    Even setting aside how the judgment itself harms rehabilitation rates, it is not only illogical but downright immoral: in essence the kind of sick Kafkaesque comedy of barring the homeless from all forms of employment, denying homes to those without jobs, and then judging them as irredeemable because they'd only become thieves and beggars anyway.

    The whole thing amounts to a bigotry laundering business.

    Sadly, there is potential for algorithmic laws and judges if it is done in the right way: taking into account past criminal records and objective metadata to be relatively fairer - say, the severity of crimes, their distribution and their trajectory; or the finding that there is a far lower rate of recidivism after six-month sentences for first-time assault convictions than after six-year sentences.

    I say 'relatively' since it cannot take into account people unjustly not prosecuted, or acquitted because of money or connections.

    It would best be done completely transparently with open source algorithms. It could also lead to a funny hobby as people can find out interesting quirks like 'it turns out that if you have exactly fourteen unpaid parking tickets and two drunk driving charges in the past you get lower sentences'.


  22. Coyne Tibbets (profile), 13 Apr 2018 @ 6:43pm

    Oppression pap

    Stereotyping people on the basis of where and how they live carries an evident risk of making it harder for them to escape challenging life situations: those with less favorable stereotypes are more likely to be prosecuted than those with more favorable profiles, thus reducing social mobility.

    What a nice way to say, "We want to limit jail-sentence prosecutions to those people who can't afford a high-priced barrister."


  23. Anonymous Coward, 13 Apr 2018 @ 7:08pm

    Re: Re:

    You say money factors in more than skin color and literally the first scenario you lay out is the non-white person being shot.


  24. Anonymous Coward, 14 Apr 2018 @ 1:37am

    Re: Moronic Black Box

    It would best be done completely transparently with open source algorithms.

    With many of the AI systems, even that does not make how the decisions are reached available for inspection; nor does examination of the training set.


  25. Jono793 (profile), 14 Apr 2018 @ 3:47am

    The decision making tool inevitably becomes the decision-maker

    This happened before with the roll-out of medical assessments for recipients of Employment Support Allowance (a welfare benefit in the UK for people unable to work due to health needs). A medical assessment was introduced to determine whether a claimant was fit for work.

    The validity and accuracy of these assessments have been subject to intense criticism over the years, for reasons I won't go into now. But in theory, this assessment was only supposed to be one piece of evidence. The final decision rested with the Department for Work and Pensions.

    In reality, the decision-makers at the DWP are not empowered to deviate from the outcome of the assessment, irrespective of any independent medical evidence submitted by the claimant. So the assessment, while intended as an aid to a decision-maker, became the de facto decision!

    The idea of this happening in criminal justice scares me no end!


  26. Jigsy, 14 Apr 2018 @ 7:05am

    Sounds about right for a police state.


  27. Anonymous Coward, 14 Apr 2018 @ 7:06am

    Re:

    Muhammed.


  28. Anonymous Cowherd, 14 Apr 2018 @ 8:58am

    A guess is a guess is a guess, even if you add "with a computer"

    Why don't they just toss a coin to decide? Would be cheaper.


  29. Coyne Tibbets (profile), 14 Apr 2018 @ 9:43am

    Re: Re: Re:

    I'm sure he was thinking that shooting at the scene precludes prosecutor decisions. But you're right, it doesn't. In that case, the prosecutor decides if the cop will be prosecuted.


  30. Anonymous Coward, 14 Apr 2018 @ 1:10pm

    Fun fact: when GDPR comes into effect in May 2018, if you have a "bad" credit score you can reset it to neutral by demanding Experian and Equifax etc. remove you from their systems entirely.

    They've already been told they have no "public safety" issues with doing this, as all they do is sell your "credit score" to private companies.

    We can ignore the fact that they downgrade people based on having ethnic surnames / living in "poor areas" without a shred of evidence to back it up and just ask to be utterly erased from their company data! yay!


  31. Anonymous Coward, 14 Apr 2018 @ 5:00pm

    Re:

    Yeah, unfortunately the problem with that is that politicians are expected to be lying, corrupt sociopaths or they wouldn't be politicians to begin with. One of the main requirements of being a career politician is being able to lie even better than you tell the truth. No, I'm not joking. The way you bring a politician down is usually not the sex or campaign-contribution scandals: you ask them a question you already know the real answer to and wait for them to lie about it. Bam, you've got them for lying in the course of an investigation. Further investigation not necessary.


