Detroit Skating Rink Horns In On Detroit PD's Facial Recognition Gaffe Racket, Denies Teen Girl Opportunity To Skate
from the even-if-our-software-is-wrong,-we-will-still-enforce-its-decision dept
It looks like Detroit, Michigan, is trying to corner the market on bad facial recognition tech. The city's police department is already associated with two false arrests based on bad matches by facial recognition software. This latest news, via Techdirt reader Jeffrey Nonken, shows the mismatches aren't limited to the public sector.
A Black teenager in the US was barred from entering a roller rink after a facial-recognition system wrongly identified her as a person who had been previously banned for starting a fight there.
Lamya Robinson, 14, had been dropped off by her parents at Riverside Arena, an indoor rollerskating space in Livonia, Michigan, at the weekend to spend time with her pals. Facial-recognition cameras installed inside the premises matched her face to a photo of somebody else apparently barred following a skirmish with other skaters.
The teen told staff it couldn't possibly be her since she had never visited the venue before. But it didn't matter to the management of the rink, which is located in a Detroit suburb. She was asked to leave, and now her parents are considering suing the rink over the false positive. Fortunately, no one at the rink felt compelled to call the police, which likely wouldn't have helped anything considering local law enforcement's track record with faulty facial recognition search results.
As for Riverside Arena, it's apologetic but not exactly helpful. Management claims deploying facial recognition tech on patrons is part of the "usual" entry process. It also claimed, without explanation, that it's "hard to look into things when the system is running," which I suppose means that's why no one could double-check the match while Robinson was still there contesting the search results.
Being wrong some of the time is also good enough for non-government work, apparently.
"The software had her daughter at a 97 percent match. This is what we looked at, not the thumbnail photos Ms. Robinson took a picture of, if there was a mistake, we apologize for that."
Obviously, there was a mistake. So this should have just been an apology, not a half-hearted conditional one offered only if, at some point in the future, someone other than the people directly affected by this automated decision steps forward to declare the mismatch a mismatch.
This is the other side of the facial recognition coin: private sector use. This is bound to result in just as many mismatches as government use does, only with software that's perhaps undergone even less vetting and without any direct oversight outside of company management. False positives will continue to be a problem. How expensive a problem remains to be seen, but since private companies are free to choose who gets to use their services, lawsuits probably won't be much of a deterrent to deploying and using unvetted software that will keep the wrong people out and, perhaps more disturbingly, let the wrong people in.
Filed Under: detroit, facial recognition, livonia, michigan, skating rink
Companies: riverside arena