Detroit PD Detective Sued For His (Second) Bogus Arrest Predicated On Questionable Facial Recognition Searches
from the fool-yourself-once... dept
On January 9, 2020, facial recognition tech finally got around to doing exactly the thing critics had been warning was inevitable: it got the wrong person arrested.
Robert Williams was arrested by Detroit police officers in the driveway of his home. He was accused of shoplifting watches from a store on October 2, 2018. The store (Shinola) had given Detroit investigators a copy of its surveillance tape, which apparently was of little interest to the Detroit PD until it had some facial recognition software to run it through.
This was the dark, grainy image the Detroit PD felt was capable of returning a quality match:
That picture is included in Williams' lawsuit [PDF] against the Detroit Police Department. Even in the best-case scenario, this picture should never have been used as a probe image for a search. It's low quality, poorly lit, and barely shows any distinguishing facial features.
What makes it worse is that all facial recognition AI -- across the board -- performs more poorly when attempting to identify minorities. That's the conclusion reached by an NIST study of 189 different algorithms. It's not just some software. It's all of it.
The Detroit PD chose to run with that photo. Then it decided the search results were close enough to probable cause to effect an arrest, even though the software's output stated clearly that its results should not be used that way. The search was performed by the Michigan State Police using the grainy image submitted by the Detroit PD. A report was returned, but investigators were cautioned against trying to turn it into probable cause:
The following statement appeared prominently on the Investigative Lead Report, in the form shown: “THIS DOCUMENT IS NOT A POSITIVE IDENTIFICATION. IT IS AN INVESTIGATIVE LEAD ONLY AND IS NOT PROBABLE CAUSE TO ARREST. FURTHER INVESTIGATION IS NEEDED TO DEVELOP PROBABLE CAUSE TO ARREST.” The phrase “INVESTIGATIVE LEAD ONLY” was highlighted in red ink.
The report was also light on any other details that might have indicated Robert Williams was actually the shoplifter in question.
The Investigative Lead Report contains neither the “score” generated by the facial recognition system representing the level of confidence that Mr. Williams’s photo matched the probe image, nor the other possible matches that, upon information and belief, should have been returned by the system.
The Detroit Police Department did not attempt to ascertain the “score” generated by the facial recognition search nor request the other possible matches to the probe photo.
Two months after the PD obtained these search results, the investigation was turned over to another detective, Donald Bussa. At the point he assumed control of the investigation, Bussa was supposed to be operating under the PD's new facial recognition policy, which acknowledged the limitations of the tech and stated that search results would need to be peer reviewed to verify their accuracy. This didn't happen.
Defendant Bussa, however, ignored the new policy. Even though the facial recognition search “identifying” Mr. Williams as the shoplifter was generated by a woefully substandard probe image and had never been peer reviewed by DPD officers, as required by the new policy, Defendant Bussa decided to rely on the lead anyway.
Bussa assembled a "six pack" of suspect photos that contained an image taken from Williams' expired driver's license. (Before this investigation took place, Williams had secured a new license and an updated photo.) He tried to speak to the staff at Shinola, but management refused to cooperate, stating that it was not interested in having its employees appear in court. Unable to speak to the sole eyewitness who had actually conversed with the shoplifting suspect, Detective Bussa decided to bypass Shinola completely.
Defendant Bussa then arranged to conduct a six-pack photo identification with Katherine Johnston. Ms. Johnston, then employed by Mackinac Partners, was contracted by Shinola for loss prevention services.
Defendant Bussa had no legitimate basis whatsoever for asking Ms. Johnston to participate in an identification procedure. Ms. Johnston was not an eyewitness. Ms. Johnston was not in the Shinola store at the time of the incident and has never seen Mr. Williams or the alleged shoplifter in person. Indeed, Ms. Johnston’s sole relation to the incident was that she had watched the same low-quality surveillance video that Detective Bussa possessed.
Bussa sent Detective Steve Posey out with the loaded "six pack" to pretty much guarantee Williams was selected as the prime suspect.
The photo array was not a blind procedure—Posey knew that Mr. Williams was the suspect. Indeed, Posey’s sheet was nearly identical to that given to Ms. Johnston, except that Mr. Williams’s name was printed in red while all other names were printed in black.
Ms. Johnston identified Mr. Williams’s expired license photo as matching the person she had seen in the grainy surveillance footage, and answered the question “Where do you recognize him from?” with “10/2/18 shoplifting at Shinola’s Canfield store.”
With that, Bussa went out to get an arrest warrant, and the rest is facial recognition history. He secured the warrant by omitting some key details, like the fact that the suspect had been picked out of a facial recognition database using a low-quality image. The warrant application also did not note that the person who picked Williams out of Bussa's loaded lineup wasn't even at the store the day the shoplifting happened. And it didn't mention Bussa's bypassing of Detroit PD policies that were put in place to prevent exactly this sort of false identification.
The lawsuit also points out that the same software and the same two detectives were involved in another false arrest -- one that occurred five months before the PD arrested Williams. Detective Bussa and Detective Posey used unvetted search results to arrest Michael Oliver for an assault he didn't commit. Even if the facial recognition software had done its job accurately (which it didn't), the tech would not have noticed something far more obvious: the suspect's arms (as captured in the phone recording) were unmarked. Michael Oliver's arms are covered with numerous tattoos.
Williams alleges a long list of violations he's hoping to hold Detective Bussa (and his supervisor) accountable for. It's going to be pretty difficult for the detective to argue he operated in good faith in the Williams arrest. After all, he'd already followed the same broken process to falsely arrest someone else months earlier. Then he ignored the PD's policy on facial recognition tech. He also ignored the big, bold warning printed across the search results he obtained from the State Police. And none of this information -- which would have undercut his probable cause assertions -- made its way into his warrant request.
Any reasonable officer would know that much of what Detective Bussa did was wrong. But Bussa would know it better than even the most reasonable of officers, because this wasn't the first time he'd screwed up.
Filed Under: detroit, detroit pd, detroit police department, donald bussa, facial recognition, michael olivier, michigan state police, robert williams, steve posey
Reader Comments
The least he could do...
The detective used a picture that was four years old (the length of time Michigan driver's licenses are valid) when he should have used the current one. And this is the least part of his misconduct.
Re: The least he could do...
Love how the judge who issued the actual arrest warrant here is given a total pass on any wrongdoing.
the judiciary is supposed to protect the public against this type of police misconduct.
Judges are supposed to know much more about law and rules of evidence and probable cause than cops.
This judge rubber stamped a highly questionable warrant application and is the primary guilty party.
And where was the Detroit City Attorney office in this process? They are the ones who would have to prosecute the case in court, but can't be bothered with any details establishing the case in the first place?
The criminal justice system is corrupt, aside from any technical shortcomings with facial-recognition. That's the main lesson here.
Re: Re: The least he could do...
Judges can be pretty bad and rubber-stampey, but none interrogate every last detail of a warrant. "Is this person who identified the suspect a person who was actually present during the commission of the crime?" is not something they should even have to ask. "Did this occur on Earth?"
If cops are lying on a warrant application, there should be consequences, and judges should be among those free to pursue or deliver such consequences. Sometimes they probably can.
Re: Re: Re: The least he could do...
...judges are under no obligation or legal pressure to approve any police warrant application.
If a judge has the slightest reasonable doubt about the validity of a warrant application, that judge should legally reject that application.
Judges are not required to question the cop and help him craft an acceptable application.
This was an alleged petty misdemeanor anyway -- guess there were no serious crimes in Detroit for this cop to investigate.
Re: Re: The least he could do...
Not exactly. In many jurisdictions, the cops know which judges will sign off on warrants without asking annoying questions. They tend to present warrant applications to those judges.
This practice avoids inconvenience for the police. We hope that it also prevents the public from having any foolish expectation of protection from government overreach.
Re: The least he could do...
It could have been worse: in Michigan, you can renew online or by mail once every 8 years. Until I renewed my license last year, my picture was from 2012. After eight years I looked very different.
Perjury?
Ms. Johnston lied under oath, since she could not have recognized anyone from the shoplifting incident as she was not present.
Are there no consequences for this?
Re: Perjury?
Was that lying under oath, or giving the required statement under duress?
"because it wasn't the first time he'd screwed up"
Ah but does twice make it well established enough that he might finally face charges?
Beyond the technology angle, I didn't think too much about these cases when TD reported on them previously. Looking again today, a few rather unpleasant thoughts definitely spring to mind.
Now that I consider the matter, I can only presume these cases have drawn heightened scrutiny because of the deficiencies of the technology involved. But if investigations are this sloppy all the time -- always grabbing faces that just happen to vaguely fit the crime -- how many other supposed criminals are still in jail, lacking such an obvious investigatory failing to push back against similarly false allegations?
Following on from that -- and considering that both suspects in these known cases are black -- I have to wonder just how many of the unknown number of falsely accused suspects are also from minority ethnic backgrounds. It's hardly a revelation to say that the job of policing attracts all sorts of people, from the very best to the very worst -- including some of the most virulent racists in society.
I don't know how to locate career histories for individual police officers, but if what they've done in these two cases is any indication, they could well have built decades-long careers for themselves by arresting and charging semi-random suspects purely on the basis of their skin colour.
I'd hope that Detroit PD would look very carefully at the history of these two officers and take the opportunity to get its house in order. Sadly, with American policing being what it is -- especially the world-infamous Detroit PD -- I doubt this will actually happen.
Inferences and suppositions about two questionable officers can only be that -- but all signs suggest to me that there's far more here that's rotten than just the technology.
Never ask a machine to do a human's job
Let this be another object lesson of placing blind confidence in a machine that can't even count to two without help.
In this case the machine did a much better job though
Yeah, that wasn't blind confidence as the software explicitly said that the match wasn't good enough for an arrest, this was someone who 'knew' he had the right suspect and was going to stack the deck as much as needed to see them arrested and punished.
The modern day version of 'reasonable cause on demand'
Funny how facial recognition software seems to be really good at resulting in the wrong people being arrested in that department and by that particular person, but hey, I'm sure it's the software's fault...
Re: The modern day version of 'reasonable cause on demand'
This is just the new-tech version of the police dog named "Probable Cause," who "alerted" on the handler's cue.
Re: Re: The modern day version of 'reasonable cause on demand'
...with the added "benefit" that facial-recognition tech is ubiquitously and infamously bad at differentiating between faces with darker skin tones. I mean, when you have a 50% chance of not seeing the difference between Prince and Janet Jackson, there's going to be a whole lot of false positives conveniently floating around.
Yeah, the concept of contrast loses a lot of leverage when skin is dark enough that shadows blend into it on a crappy camera in poor lighting. The solution should be either building any camera used for facial recognition to specs that rise above this problem, or having the facial recognition algorithm just say "ambiguous results, I got nothing" when it actually can't clearly recognize the markers.
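The "ambiguous results, I got nothing" behavior described above could be sketched as a simple confidence gate. This is a hypothetical illustration -- the function name, thresholds, and dispositions below are invented for the example, not taken from any real facial recognition product:

```python
# Illustrative sketch of gating facial-recognition output on match confidence.
# All thresholds and labels here are assumptions, not from any real system.

def interpret_match(score: float,
                    lead_threshold: float = 0.6,
                    id_threshold: float = 0.9) -> str:
    """Map a raw similarity score (0.0-1.0) to an investigative disposition.

    Below lead_threshold the system reports nothing usable; between the
    thresholds the result is at most an investigative lead, never an ID.
    """
    if score < lead_threshold:
        return "ambiguous: no usable result"
    if score < id_threshold:
        return "investigative lead only: NOT probable cause"
    return "high-confidence candidate: still requires corroboration"
```

The point of the sketch is that even the top bucket never returns "positive identification" -- exactly the caveat printed in red on the Investigative Lead Report.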
Drug Sniffing dogs
Seems like this tech is a tool similar to drug-sniffing dogs: just a way to generate probable cause and make false arrests. The officer was probably approached by a "company representative" and pushed into "proving the software works."
"Don't worry, what could go wrong? We got your back."
Look into the bozo's finances.
[ link to this | view in chronology ]
Anyone?
Anyone seen some of the old camera pictures -- fixed focus, out of exposure range, sun in the wrong position?
Anyone ever seen the first digital camera pics and video?
Anyone looked at even the current range of security cameras and wondered WTH?
There is one really nice trick available to many digital cameras: IR lighting.
Every outdoor camera and most indoor ones can see in IR, and many have IR LEDs for exactly that reason. (Those that don't see IR have a filter on them so they can't; the sensors are very sensitive to IR -- it even damages the sensor over time.) As an invisible light source, IR lets the camera see very well. So why in hell don't security cameras USE this feature? It makes the picture black and white, but very detailed.
But people think color is best, and don't get the idea that you can capture both at the same time.
Re: Anyone?
The primary problem with using IR in a surveillance camera is that it also sees through many types of clothing, making such cameras a privacy nightmare for the companies operating them. And you thought there was a fit thrown over millimeter-wave radar in airports...
Re: Re: Anyone?
Probably means they are wearing a very thin synthetic, worth less than $1 and sold to them for $10-100.
It's not a porn channel, and the recordings are only used at the time of a crime.
And I really don't think IR will penetrate much of anything:
https://www.researchgate.net/publication/258196189_Transmittance_of_Infrared_Radiation_Through_Fabric_in_the_Range_8-14_m
Consider that this is one-way transmission, and the loss (about 65%) even through one layer would make picking up any reflection off skin almost impossible.
Add the absorption of the skin itself, and the returned signal drops to practically zero. (There is a medical application here, though: with the correct lighting you can see skin damage, especially the types that can lead to cancers.)
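The round-trip argument above can be made concrete with some back-of-the-envelope arithmetic. The numbers below are illustrative assumptions: a 35% one-way transmittance (roughly the 65% loss mentioned above) and a guessed 5% skin reflectance for long-wave IR.

```python
# Back-of-the-envelope: how much IR signal survives a round trip through fabric?
# Both values are illustrative assumptions, not measured figures.

one_way_transmittance = 0.35   # ~65% loss through one fabric layer
skin_reflectance = 0.05        # assumed: skin absorbs most long-wave IR

# Light must pass through the fabric, reflect off the skin, and pass back out,
# so the fabric attenuation applies twice.
round_trip = one_way_transmittance * skin_reflectance * one_way_transmittance
print(f"Fraction of incident IR returned: {round_trip:.4f}")  # prints 0.0061
```

Well under 1% of the incident IR would make it back to the camera under these assumptions, which supports the "beyond zero" intuition.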