School Security Software Decides Innocent Parent Is Actually A Registered Sex Offender
from the you-can't-argue-with-(search)-results dept
An automated system is only as good as its human backstop. If the humans making the final judgment call are incapable of using good judgment, the system is useless.
School personnel allowed a machine to do all of their critical thinking, resulting in this unfortunate turn of events.
Staff in an Aurora school office mistakenly flagged a man as a registered sex offender when he and his family went to his son's middle school for a recent event.
Larry Mitchell said he was humiliated Oct. 27 when Aurora Hills Middle School office staff scanned his driver license into a software system used to screen visitors to Aurora Public Schools district schools.
The system, provided by a private company, flagged Mitchell as a potential match with a registered sex offender in a nation-wide database. Staff compared Mitchell’s information with the potential match and determined that match was correct, even though there are no offenders in the national sex offender registry with his exact name and date of birth.
Not only did these stats not match, but the photos of registered sex offenders with the same name looked nothing like Larry Mitchell. The journalists covering the story ran Mitchell's info through the same databases -- including his birth name (he was adopted) -- and found zero matches. What the search did turn up was a 62-year-old white sex offender who also sported the alias "Jesus Christ," and a black man roughly the same age as Mitchell, who is white.
School administration has little to say about this botched security effort, other than that policies and protocols were followed. But if so, school personnel need better training… or maybe at least an eye check. Raptor, which provides the security system used to misidentify Mitchell, says photo-matching is a key step in the vetting process [PDF].
In order to determine a False Positive Match the system operator will:
i. Compare the picture from the identification to the picture from the database.
ii. If the picture is unclear, we will check the date of birth, middle name, and other identifying information such as height and eye color.
iii. The Raptor System has a screen for the operator to view and compare photos.
iv. If the person or identifying characteristics are clearly not from the same person, the person will then be issued a badge and established procedures will be followed.
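On paper, that's a straightforward decision routine. Here's a minimal sketch of the described logic in Python (the function name, field names, and the stand-in flags for the operator's visual judgment are illustrative assumptions, not anything from Raptor's actual software):

def classify_flagged_visitor(visitor, registry_hit, photos_clearly_differ, photo_is_unclear):
    # Steps i/iii: the operator visually compares the ID photo with the
    # registry photo; the two boolean flags stand in for that judgment call.
    if photos_clearly_differ:
        return "false positive"  # step iv: issue a badge, follow established procedures

    # Step ii: if the photo is unclear, fall back to identifying details.
    if photo_is_unclear:
        for field in ("date_of_birth", "middle_name", "height", "eye_color"):
            if visitor.get(field) and registry_hit.get(field) and visitor[field] != registry_hit[field]:
                return "false positive"  # identifying details clearly do not match

    # No mismatch found: treat the hit as a possible match.
    return "possible match"

Either branch should have cleared Mitchell: the photos looked nothing alike, and neither the name nor the date of birth matched. The checklist wasn't the failure point; the people running it were.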
Even if you move past the glaring mismatch in photos (the photos returned in the Sentinel's search of Raptor's system are embedded in the article), neither the school nor Raptor can explain how Raptor's system returned results that can't be duplicated by journalists.
Mitchell said he was adopted, and his birth name is Lawrence Michael Evans. The Sentinel did not find a match with that or his legal name and date of birth in the national sex offender registry.
Raptor says its system is reliable, stating it only returned one false positive in that county last year. (And now the number has doubled!) That's heartening, but that number will only increase as system deployment expands. Raptor's self-assessment may be accurate, but statements about the certainty of its search results are hardly useful.
The company's sales pitch likely includes its low false positive rate, which, in turn, leads school personnel to believe the system rather than the person standing in front of them -- one who bears no resemblance (physical or otherwise) to the registry search results. Mitchell still isn't allowed into the building without a security escort and is hoping that presenting school admins with his spotless criminal background check will finally jostle their apparently unshakeable belief in Raptor's search results.
This failure is also an indictment of security-over-sanity thinking. The Sentinel asked government officials if there were any incidents in which sex offenders had gained access to schools, thus necessitating this $100,000+ investment in Raptor's security system. No results were returned.
Neither local school or state public safety or education officials could point to data showing how many registered offenders try to seek access to schools, or if a registered offender visiting a school has ever harmed a student in Aurora or Colorado.
Given this history, Raptor's system is always going to be better known -- at least at this school -- for locking out non-criminals than for catching sex offenders trying to be somewhere they shouldn't. If the schools haven't seen activity that necessitates the use of this system, it will always produce more false positives than actual hits. When there's no one to catch, you're only going to end up stigmatizing innocent parents. It's a lot of money to pay for solving a problem that doesn't exist. The school has purchased a tiger-proof rock and somehow managed to hurt someone with it.
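To put rough numbers on that: when essentially zero registered offenders ever show up at the front office, nearly every alert the system raises will be a false one, no matter how low the per-scan error rate is. A back-of-the-envelope sketch in Python, using visitor counts and rates that are purely assumed for illustration (they are not Raptor's or the district's figures):

# All figures below are assumptions for illustration only.
annual_scans = 100_000             # assumed visitor scans across a district per year
offender_visit_rate = 0.0          # officials could point to no documented incidents
false_positive_rate = 1 / 50_000   # assumed per-scan false positive rate

expected_true_hits = annual_scans * offender_visit_rate      # 0
expected_false_alarms = annual_scans * false_positive_rate   # ~2

print(f"true hits: {expected_true_hits:.0f}, false alarms: {expected_false_alarms:.0f}")
# With no one to catch, 100% of the alerts that fire point at innocent visitors.

Every hit the front office will ever see, in other words, is overwhelmingly likely to be another Larry Mitchell.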
Filed Under: ai, aurora hills middle school, false positive, larry mitchell, schools, sex offenders
Companies: raptor
Reader Comments
And you want to let GOOGLE keep data on everyone?
Or is Google's software perfect, besides its motives, so should be allowed -- and it IS entirely the choice of "natural" persons whether that or any corporate fiction is even written on paper, but we do need to inform our elected servants when it's clearly become a hazard to our privacy -- to track everyone everywhere forever?
Are they only deployed in 1 county? Or are they trying to keep the numbers low by subdividing incidents into tiny categories to bolster their claims of being awesome?
When "Do Something!!!!!!!!" meets snake oil salesmen.
We picked a boogeyman & did something, at the expense of your children's education. Don't judge us by this absolute failure of our system & users, stay focused on the imagined stories of sex offenders grouping up to kidnap hundreds of children at once from schools under the unwitting noses of staff.
This doesn't even begin to touch on the wildly different ways one can end up on the sex offender registry - everything from abusing a child, to peeing in an alley, to mooning people from a bus, to having a girlfriend a year younger whose parents screamed DO SOMETHING!!!!!!!!!
Wouldn't this money have been better spent on active shooter drills & teaching kids how to apply pressure to gunshot wounds to save lives??
Re:
Or here's a novel idea, maybe use the money on paying teachers more and buying school supplies so the teachers don't have to provide them themselves?
I know, it's far fetched. What school district would do such a thing?
Re: Re:
Children are sooo important!!!!! That's why we pay those who educate them so shittily & pay administrators who sign these contracts way more than they are worth.
To think or not to think, but the machine told me so!
This incident also points out the lack of critical thinking training in our educational systems. If the person running the machine couldn't decide that two unlike photographs meant a machine error, then they shouldn't be able to get a license to walk, let alone a position of responsibility with any school system.
When ego and profits are on the line
Mitchell still isn't allowed into the building without a security escort and is hoping that presenting school admins with his spotless criminal background check will finally jostle their apparently unshakeable belief in Raptor's search results.
The match was bogus and couldn't be repeated. Rather than admit that the system and the staff screwed up, they instead doubled down and are acting as though the system was right the first time.
There's pigheadedness, and then there's outright denial of reality, and I suspect I know why (beyond just a simple refusal to admit to having screwed up)...
The Sentinel asked government officials if there were any incidents in which sex offenders had gained access to schools, thus necessitating this $100,000+ investment in Raptor's security system. No results were returned.
Admitting that it and they were wrong here, blatantly so, could lead to people asking just how good the product and 'procedures' involved are, and since that would require admitting that both are notably flawed, well, it's just easier to pretend it was spotless all along, even if it leads to an innocent man being treated as a sex offender.
A possible scenario...
A possible scenario to explain what happened: whoever was running the system realized that Larry Mitchell wasn't the person in the database. However, that person thought
Interesting and BACK to the 70's we go..
It's from a book..
This is a great way to get people USED to the need/use/conspiracy of DATA collection and permanent ID.
1 more step to Orwell..
https://en.wikipedia.org/wiki/County_(United_States)
So approx. 3,000+ false positives/year (one per county across the US's 3,000-plus counties). Has no one asked about the number of false negatives per year?
Precrime software
Please please please
But seriously, how is the list updated, what is the process to check it, and can the data be manipulated? Data like that is only as good as its weakest link. And since just following orders apparently is okay (sarcasm), well....
All I can think of is the Earth 2140 thing, where the computer overlord causes Earth's destruction due to bad data!
It's sad to see that the same sort of airport-grade security that we've become accustomed to is now trickling down into public schools, apparently to prevent some sort of rare crime that in all probability has never happened before.
But when do other, more common crimes get added to the list? And how long before visiting parents, in addition to being ID'd, get irradiated and body-searched upon entering a school, just like all the children?
Re: Re: Precrime software
You will be.
Re: Re: Re:
The school administrators who sign these contracts need to be paid a king's ransom, because otherwise they'll quit and work for one of the companies who sold them some multi-million dollar polished turd.
Re: Re: When ego and profits are on the line
But a reasonable proxy is abductions. It's not an exact analogy, but it's somewhat informative. Children taken by strangers or slight acquaintances represent only one-hundredth of 1 percent of all reported missing children.
“Safety is a top priority,” Christiansen said. “This was just a matter of us following our district protocol. Unfortunately the parent who showed up this morning declined the escort and left.”
Re:
Translation: 'He refused to play along with the whole 'I'm a sex offender' accusation and be paraded around with a visible escort to make it crystal clear that he could not be trusted.'
Yeah, can't imagine why he wasn't willing to go along with that, though nice of them to admit that it's apparently 'district protocol' to double down on groundless accusations rather than admit fault.
Re: Re: Re: When ego and profits are on the line
Think about the number of people in your life who have been in positions of trust and from whom you have withdrawn that trust.
I count one of those, but I have never heard of a case of a stranger abusing someone. Publicly, lots of *Catholic priests*. We all know of Larry Nassar. Ariel Castro is the only abduction case I can think of.
Plenty of ways to ensure there is no hanky-panky at a school event; simply keeping an eye on someone specific will do that.
It is better to claim a parent is a sex offender than to run the risk of a lawsuit.
Minimizing
I love that minimization. One false positive in the county...and just how many systems did they install in that county?
So why did they go by county? Well, Aurora, Colorado spans three counties. How many false positives were there in Aurora? In each of its two school districts?
Far be it from them to discuss false positive rates, or to mention how many false positives they had nationwide.
Re:
Well, first I'd be either laughing at the obviously erroneous data or congratulating whoever came up with the first time machine.
But, the data being stored is not a problem. It's what's done with the data that's the issue. Here, the issue was not the false flagging. The problem was the school acting on it even though 5 seconds of due diligence would have told them it was not correct.
Similarly, Google having false information about me is not a problem. It becomes problematic when the people you defend insist they have to automatically process that data in case I watch 2 seconds of video without them collecting a toll. It becomes an issue when governments insist they have to become de facto state censors because they can't be bothered to train or fund law enforcement properly.
Surely, even someone as rabidly committed to being an idiot as you are can see the difference?
Re: To think or not to think, but the machine told me so!
So, we end up on dangerous ground, where people trust the machine even when it's obviously wrong, because it's so rarely in that state. People drive their cars off cliffs because the sat nav told them to, even though their own eyes tell them it's wrong.
At least in those cases we can laugh at the idiots, but when the person is a victim because other people are gullible morons, it's less entertaining. Gilliam's Brazil and Kafka's The Trial are not places we want people to actually be held in real life.
- A mass shooting at the school.
- Kids getting a lousy education because of underpaid teachers and lack of materials.
- A visit by a registered sex offender.
Apparently, the latter was deemed most harmful -- which might actually be correct, if you only consider the harm to the administration's reputation.
Re: And you want to let GOOGLE keep data on everyone?
*insert slow clap here*
Re: Re: When ego and profits are on the line
So, really, they should've been using this system to ensure that only strangers are let into the events. Family members are just too risky.
Re:
There's no requirement that anyone accuses you of having done anything.
I don't understand this.
These people could be replaced with actual robots and I doubt anyone could tell the difference.
Re: Minimizing
Based upon what "the system" is being used for, I would think the acceptable rate of failure would be zero.
Re:
Ten years, at most.
Oh right, they're implementing that next year...
Re: Re:
Not really. There are two people mentioned that the "who" in "who is white" could be referring to, and one of them is explicitly described as a "black man". That kinda leaves only one possibility :)
You are a Slave to the Device and then You Die!
An automated system is only as good as its human backstop. If the humans making the final judgment call are incapable of using good judgment, the system is useless.
Therein lies the problem: a lot of people relish the opportunity to cede their own thought process in favor of an algorithm or device doing the "thinking" (comment author used scare quotes to alert any potential readers that the word "thinking" is being used out of context, as algorithms/devices do not think) for them.
Critical thinking and problem-solving skills - whatever are they good for in rote-memorization nation?