Detroit Police Chief Says Facial Recognition Software Involved In Bogus Arrest Is Wrong '96 Percent Of The Time'

from the good-to-have-that-all-out-in-the-open dept

The law enforcement agency involved in the first reported false arrest linked to facial recognition software is now talking about that software. The Detroit Police Department -- acting on a facial recognition "match" handed to it by State Police investigators -- arrested Detroit resident Robert Williams for allegedly shoplifting watches from an upscale boutique.

Williams did not commit this theft. He even had an alibi. But the investigators weren't interested in his answers or his questions. They had a lo-res screen grab from the store's CCTV camera -- one that software provided by DataWorks Plus said matched Williams' driver's license photo. Thirty hours later, Williams was cut loose by investigators, one of whom said "I guess the computer screwed up" after rewatching the camera footage with Williams present.

The officers ignored the bold letters at the top of the "match" delivered by the software. The warning said "This document is not a positive identification." It also said the possible match was not "probable cause for arrest." Unfortunately for the misidentified Michigan resident, the cops who arrested him treated the printout as both: positive identification and probable cause.

The policies governing law enforcement's use of this tech have changed since Williams' arrest in January. Under the current policy, lo-res images like the one that led to this arrest are no longer allowed to be submitted to the facial recognition system. That fixes a very small part of the problem. The larger problem is that the tech is mostly good at being bad. This isn't a complaint from critics. This comes directly from the top of the DPD.

In a public meeting Monday, Detroit Police Chief James Craig admitted that the technology, developed by a company called DataWorks Plus, almost never brings back a direct match and almost always misidentifies people.

“If we would use the software only [to identify subjects], we would not solve the case 95-97 percent of the time,” Craig said. “That’s if we relied totally on the software, which would be against our current policy … If we were just to use the technology by itself, to identify someone, I would say 96 percent of the time it would misidentify."

So, it's a bad idea to rely on the software and nothing else when attempting to identify criminal suspects. What happened to Robert Williams supposedly should never happen again… according to policy. But that raises the question of why the PD still uses the software if it adds so little value to the investigative process. Officers don't seem to care much for the tech, and yet it's still in use. The police chief thinks it's still worth keeping around, even though it has yet to show any positive results.

Craig and his colleague, Captain Ariq Tosqui, said that they want to continue using facial recognition because they say it can be a tool to assist investigators even if it doesn’t often lead to arrest. But even when someone isn’t falsely arrested, their misidentification through facial recognition can often lead to an investigator questioning them, which is an inconvenience at best and a potentially deadly situation at worst. According to Tosqui, the technology has been used on a total of 185 cases throughout the years. “The majority of the cases the detective reported back that [the software] was not useful.”

What it can do, apparently, is add another layer of surveillance on top of what's already in place. And it will do what most law enforcement surveillance creep does: put more eyes on Black residents. Part of the mild reforms Detroit enacted -- rather than implementing a facial recognition ban -- mandates the publication [PDF] of facial recognition data by the department. Since the beginning of this year, the PD has used the tech 70 times. All but two of those instances involved a Black resident's photos being submitted.

One of the main sources for uploaded photos is the Project Green Light (PGL) network of cameras [PDF]. These are installed in or around businesses that participate in the project. Participation comes at a cost: $1,000-$6,000 for the initial investment and $1,600 a year in video storage fees. The 550 participants' cameras are supposedly monitored 24/7 at the PD's "real time crime center." Incoming footage can be subjected to other surveillance tech, like facial recognition software and automatic license plate readers.

Here's the panopticon the city wanted:

The City of Detroit put forth a Request for Proposals for a contractor to work closely with the city, DPD, and Motorola (Company that help set up the RTCC) to set up a “turn-key” facial recognition system that would work with the already existing infrastructure of the RTCC. They specifically asked that the facial recognition work on at least 100 concurrent real-time video feeds, be integrated into the PGL system, and can be used by officers with a mobile app.

It got the software it needed from DataWorks: a blend of two algorithms known as "Face Plus." It's scary stuff.

Face Plus is capable of automatically searching all faces that enter camera frames against photos in the entity’s database, alerting authorities to any algorithmic matches. Additionally, there is a “watchlist” option where persons of interest can be monitored and alerted for…

According to DataWorks Plus, in 2017, this repository contained 8 million criminal pictures and 32 million “DMV” pictures. As the Free Press reported in March 2019, almost every Michigan resident has a photo of them in this system.

This is the system the police chief calls 96% inaccurate and DPD investigators call useless. Perhaps this false arrest, which has made national news, will put the brakes on the planned expansion of PGL and Face Plus to all public transit stops and vehicles. Or maybe it will convince the city it's paying too much for something that barely works and take facial recognition away from the PD until something better -- and less biased -- comes along.


Filed Under: detroit, detroit pd, facial recognition, police, robert williams
Companies: dataworks plus


Reader Comments


  1. That One Guy (profile), 2 Jul 2020 @ 2:19am

    'Can't blame us, it's the software's fault.'

    If the goal is more efficient use of time and resources then software throwing out almost nothing but false positives is a terrible idea, so either the department is run by idiots or the software is being used for something else.

    Given that an almost 100% failure rate will ensure a lot of wasted time, the only reason I can think of to keep using it is as a scapegoat: a way to excuse any bogus investigations and/or arrests by claiming that, since the software noted a match, they were just following up, 'just to be sure'.


  2. Upstream (profile), 2 Jul 2020 @ 5:03am

    Facial recognition software is garbage!

    Facial recognition software is so full of nonsense as to be farcical. The inherent, often racial, biases built into the systems are widely known. And while Detroit Police Chief James Craig admits the software is junk, most do not, particularly the companies selling the software (I'm especially looking at you, Clearview and Hoan Ton-That). One of the many holes in their bogus claims of facial recognition software's accuracy is a logical fallacy called base rate neglect. It is counter-intuitive, and involves some math, but you can find some good explanations of it here and here. Bottom line/tl;dr: while the apparently high accuracy numbers claimed for facial recognition software may be technically true, they mask just how horrible the stuff really is. It's one of those "lying with statistics" things.
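
    [Editor's note: the base rate effect the comment describes can be sketched in a few lines of Python. The numbers below are purely hypothetical -- a matcher claimed to be "99% accurate" searching a database where only one entry in 10,000 is the true suspect -- not figures from DataWorks Plus or the DPD.]

    ```python
    def match_precision(accuracy: float, base_rate: float) -> float:
        """Probability that a reported match is actually correct (Bayes' rule).

        accuracy:  chance the system classifies any single photo correctly
        base_rate: fraction of database entries that are the true suspect
        """
        true_positives = accuracy * base_rate
        false_positives = (1 - accuracy) * (1 - base_rate)
        return true_positives / (true_positives + false_positives)

    # A "99% accurate" matcher, one true suspect among 10,000 photos:
    p = match_precision(accuracy=0.99, base_rate=1 / 10_000)
    print(f"{p:.2%}")  # under 1% -- nearly every reported "match" is wrong
    ```

    Even with a headline accuracy of 99%, the rarity of true suspects means false positives swamp true ones -- which is exactly how a vendor's "technically true" accuracy claim coexists with a chief calling the system wrong 96 percent of the time.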


  3. Rohit (profile), 2 Jul 2020 @ 5:07am

    Police have developed new software

    Here the police are adopting new software that may or may not take our personal data.


  4. Anonymous Coward, 2 Jul 2020 @ 5:39am

    Or maybe it will convince the city it's paying too much for something that barely works and take facial recognition away from the PD until something better -- and less biased -- comes along.

    I'm sorry? Where did the "biased" come from?

    Perhaps:

    Or maybe it will convince the city it's paying too much for something that barely works and take ... away ... the PD until something better -- and less biased -- comes along.

    Is that what you meant?


  5. Kitsune106, 2 Jul 2020 @ 6:00am

    Hmmmm

    So, if facial recognition is good enough for police work... can we use it to I.D. police who did not have name badges? I mean, if it's good enough to be used on the public, it's good enough to be used on the police... right?


  6. TheResidentSkeptic (profile), 2 Jul 2020 @ 6:14am

    TV shows are supposed to be fantasy...

    This RFQ "...Request for Proposals for a contractor to work closely with the city, DPD, and Motorola (Company that help set up the RTCC) to set up a “turn-key” facial recognition system that would work with the already existing infrastructure of the RTCC. They specifically asked that the facial recognition work on at least 100 concurrent real-time video feeds, be integrated into the PGL system, and can be used by officers with a mobile app ..."

    Seems to come right out of Person of Interest. Someone took it a little too seriously and really wants their own "The Machine."


  7. Pixelation, 2 Jul 2020 @ 9:19am

    Precrime

    Precrime got here yesterday. It just doesn't work. But don't worry, you're being watched too!


  8. crogs, 2 Jul 2020 @ 10:18am

    Facial recognition in China can now recognize foreigners with sunglasses on and a mask on (said foreigners') chins...

    Be careful what you wish for, Chicken Little.

    “Craig and his colleague, Captain Ariq Tosqui, said that they want to continue using facial recognition because they say it can be a tool to assist investigators even if it doesn’t often lead to arrest.”

    Gotta love how the ADL and its spawn have managed to “product place ” every foreign sounding name into US headlines though....

    That sure didn't work out well for that Somali cop in Minneapolis, who shot the white-privilege-banging-on-squad-car woman.

    I bet Officer Craig will be our next Derek Chauvin.

    Any takers?

    The Bronfman, etc. klan/syndicate rolls like that.


  9. Coyne Tibbets (profile), 2 Jul 2020 @ 10:36am

    Police are the bad guys

    An inaccuracy of 94%? I could do better with a Ouija Board. I could do better throwing darts at a Michigan map blindfolded.

    Let's say that what he means is that 94% of matches generated prove to be false. Now I'm sure that's not what the sales pitch said, but who trusts those? If I had a piece of software that failed 94% of the time, I would ashcan it and a lawsuit against the vendor would follow. Not the police department, no siree. It must be meeting some need of theirs.

    What could that be? In this case, it probably produced seventeen "wild geese." Which brings us to the problem of picking out the Robert Williams, the goose. I wasn't there, but I'm betting it went something like this, as they looked through the seventeen candidates: "Too rich. Too sympathetic. Too professional. Too connected. Wait... here's one that's poor... don't you think he looks right?"

    Which brings us to the eyewitness. Of course, the police probably helped that along as well... "Which one of these six men is the shoplifter? Not sure? Which one looks most like him? Still not sure? Well, did you look at #4?" [wink, wink, nudge, nudge]

    So now they have a candidate, and credit for the collar... and who cares about the goose's -- Robert Williams' -- rights? They probably figured he did something even if he was innocent. They certainly didn't expect him to bond out, and probably expected him to plead guilty to avoid a trial.

    Who wants to waste time finding real criminals when convicting an innocent is so easy?

    The point of all of this is... the bad guys here are the police. Yes, the software is crap, but it probably wouldn't exist if the police everywhere weren't so gung ho about it. It wouldn't still be installed and in use in Detroit if police there didn't regard it as good enough. It comes down to the same old story: police violating civil rights wholesale, and looking to expand the franchise -- with just a pinch of facial recognition companies to help them out.


  10. Uriel-238 (profile), 2 Jul 2020 @ 1:03pm

    This reminds me of Chicago police dogs

    Detection dogs in Chicago will false-positive signal when sniffing Latinos at a rate in the 90%+ range.

    And yet they're still a legally viable gateway to probable cause.

    Given that a false-positive facial recognition match led to an arrest, it sounds like the same thing.

    Next thing they're going to start using Michael Shermer's dowsing rods.


  11. Anonymous Coward, 2 Jul 2020 @ 3:11pm

    It's awful, but we want to continue using it. I mean, paying large amounts of cash to some company. Hopefully one day we will make facerec scary accurate and track everyone 24/7, generally for non-criminal purposes. Unless we want to create a criminal out of someone.

    Then there's this: $1,600 a year in video storage fees. Lolwut? GTAFO.


  12. Anon, 3 Jul 2020 @ 7:35am

    Re: Facial recognition software is garbage!

    Kind of like the old polygraph test and its younger cousin, the "Voice Stress Analyzer."


  13. nasch (profile), 7 Jul 2020 @ 7:48am

    Re:

    Facial recognition in China can now recognize foreigners with sunglasses on, and a mask on (said foreigners ) chin....

    According to who, the Chinese Communist Party? Or someone credible?


