Las Vegas Police Are Running Lots Of Low Quality Images Through Their Facial Recognition System
from the that's-going-to-end-badly dept
Even at its best, facial recognition software performs pretty poorly. When algorithms aren't generating false positives, they're acting on the biases baked into them, making it far more likely minorities will be misidentified by the software.
The better the image quality, the better the search results. The use of a low-quality image pulled from a store security camera resulted in the arrest of the wrong person in Detroit, Michigan. The use of another image with the same software -- one that didn't show the distinctive arm tattoos of the non-perp hauled in by Detroit police -- resulted in another bogus arrest by the same department.
In both cases, the department swore the facial recognition software was only part of the equation. The software used by Michigan law enforcement warns investigators search results should not be used as sole probable cause for someone's arrest, but the additional steps taken by investigators (which were minimal) still didn't prevent the arrests from happening.
That's the same claim made by Las Vegas law enforcement: facial recognition search results are merely leads, not probable cause. As is the case everywhere law enforcement uses this tech, low-quality input images are common. Investigating crimes means relying on security camera footage, captured by cameras far less capable than the multi-megapixel cameras found on everyone's phones. The Las Vegas Metro Police Department relied on low-quality images for many of its facial recognition searches, documents obtained by Motherboard show.
In 2019, the LVMPD conducted 924 facial recognition searches using the system it purchased from Vigilant Solutions, according to data obtained by Motherboard through a public records request. Vigilant Solutions—which also leases its massive license plate reader database to federal agencies—was bought last year by Motorola Solutions for $445 million.
Of those searches, 471 were done using images the department deemed “suitable,” and they resulted in matches with at least one “likely positive candidate” 67% of the time. But 451 searches, nearly half, were run on “non-suitable” probe images. Those searches returned likely positive matches—which could mean anywhere from one to 20 or more mugshots, all with varying confidence scores assigned by the system—only 18% of the time.
Fortunately, low-quality images rarely seem to return anything investigators can use. (Although that 18% still amounts to 82 "likely positive matches...") If they did, we'd be seeing far more bogus arrests than we've seen to this point. Of course, prosecutors and police aren't letting suspects know facial recognition software contributed to their arrests, so courtroom challenges have been pretty much nonexistent.
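For a rough sense of scale, here's the arithmetic behind those percentages. This is a minimal sketch using only the figures quoted above; Motherboard reported rates, not raw counts, so the rounded values are estimates (the article's "82" suggests the actual non-suitable count was 82, since 82/451 is about 18.2%).

```python
# Figures from the Motherboard data quoted above.
suitable_searches = 471        # searches on "suitable" probe images
suitable_hit_rate = 0.67       # returned at least one "likely positive candidate"
nonsuitable_searches = 451     # searches on "non-suitable" probe images
nonsuitable_hit_rate = 0.18

# Estimated raw counts of searches that produced likely positive matches.
suitable_hits = round(suitable_searches * suitable_hit_rate)        # ~316
nonsuitable_hits = round(nonsuitable_searches * nonsuitable_hit_rate)  # ~81

print(f"Suitable-image hits: ~{suitable_hits} of {suitable_searches}")
print(f"Non-suitable-image hits: ~{nonsuitable_hits} of {nonsuitable_searches}")
```

Even at the lower 18% rate, dozens of low-quality probe images still produced mugshot lineups investigators could act on, which is why the precautions taken after a search matter.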
Although most of the information in the documents is redacted -- making it difficult to verify LVMPD claims about the software's contribution to arrests and prosecutions -- enough details remained to provide a suspect facing murder charges with information the LVMPD had never turned over to him or admitted to in court.
Clark Patrick, the Las Vegas attorney representing [Alexander] Buzz, told Motherboard that neither the LVMPD nor the Clark County District Attorney’s office ever informed him that investigators identified Buzz as a suspect using, at least in part, facial recognition technology. The Clark County District Attorney’s office did not respond to an interview request or written questions.
Had this information been given to Buzz and his attorney at the beginning of the trial, he likely would not have waived his right to a preliminary evidentiary hearing. Had that hearing taken place -- along with knowledge of a private company's contribution to the investigation -- prosecutors might have had to produce information about the tech and the surveillance footage it pulled images from.
The documents don't appear to show a reliance on low-quality images to make arrests, but they do show investigators will run nearly any image through the software to see if it generates some hits. The precautions taken after a search matter most. If investigators treat matches only as leads, most false arrests will be headed off. But if investigators take shortcuts -- as appears to have happened in Detroit -- the outcome is disastrous for those falsely arrested. A person's rights and freedoms shouldn't be at the mercy of software that performs poorly even when given good images to work with. The use of this software is never going to go away completely, but agencies can mitigate the damage by refusing to treat matches as probable cause.
Filed Under: facial recognition, las vegas, law enforcement, police
Reader Comments
Hmm... Mr Brady might want to join Mr Patrick in having a chat with the prosecutor in front of the judge.
Nothing here, moving on
Back in the 1950's when every newspaper in South America was 'Classified' and restricted in the US, one paper on the west coast (That west coast, not this one) published the same photo whenever an Asian man was the subject of a news story. Since all US Peace Officers are convinced (a verb) that all of us are in some way criminal, and could be dangerous, any photo will do.
only leads...
And if that's all they have to go on, they'll treat it as if it's concrete proof of the crime!
After all, they must stop crime and if this is all the 'evidence' they have, then it must be followed to conclusion!
Until..
You install a Chip with a DNA sample to be registered ON ENTRY to any business..
There is no Perfect solution.. N
NOT even near, a good Solution.
Until you can give a computer the ability to discern racial differences.. and be at least 99% correct, you wont get more then 10%.(IMO)
But also, GET BETTER CAMERA'S..
Tons of companies that sell BASIC, security devices dont tell you they have CRAP camera's.
I dont think MOST of them even know the restrictions OF the camera's. LIKE: never point to a SUN FILLED window. You cant see much of anything.
Re: Until..
Shhh! Don't tell everybody!
⋮