School Security Software Decides Innocent Parent Is Actually A Registered Sex Offender
from the you-can't-argue-with-(search)-results dept
An automated system is only as good as its human backstop. If the humans making the final judgment call are incapable of using good judgment, the system is useless.
School personnel allowed a machine to do all of their critical thinking, resulting in this unfortunate turn of events.
Staff in an Aurora school office mistakenly flagged a man as a registered sex offender when he and his family went to his son's middle school for a recent event.
Larry Mitchell said he was humiliated Oct. 27 when Aurora Hills Middle School office staff scanned his driver license into a software system used to screen visitors to Aurora Public Schools district schools.
The system, provided by a private company, flagged Mitchell as a potential match with a registered sex offender in a nation-wide database. Staff compared Mitchell’s information with the potential match and determined that match was correct, even though there are no offenders in the national sex offender registry with his exact name and date of birth.
Not only did these stats not match, but the photos of registered sex offenders with the same name looked nothing like Larry Mitchell. The journalists covering the story ran Mitchell's info through the same databases -- including his birth name (he was adopted) -- and found zero matches. What the search did turn up was a 62-year-old white sex offender who also sported the alias "Jesus Christ," and a black man roughly the same age as Mitchell, who is white.
School administration has little to say about this botched security effort, other than that policies and protocols were followed. But if so, school personnel need better training… or maybe at least an eye exam. Raptor, which provides the security system used to misidentify Mitchell, says photo-matching is a key step in the vetting process [PDF].
In order to determine a False Positive Match the system operator will:
i. Compare the picture from the identification to the picture from the database.
ii. If the picture is unclear, we will check the date of birth, middle name, and other identifying information such as height and eye color.
iii. The Raptor System has a screen for the operator to view and compare photos.
iv. If the person or identifying characteristics are clearly not from the same person, the person will then be issued a badge and established procedures will be followed.
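In other words, the whole determination boils down to a photo comparison with a handful of field checks as fallback. Here's a minimal sketch of that logic in Python -- the record fields and function are hypothetical illustrations of the steps above, not Raptor's actual code or data model:

```python
# A minimal sketch of Raptor's published vetting steps, for illustration
# only. The Record fields and this function are assumptions -- Raptor's
# actual implementation is not public.
from dataclasses import dataclass

@dataclass
class Record:
    name: str
    date_of_birth: str  # e.g. "1968-04-12"
    height: str         # e.g. "5'11\""
    eye_color: str

def is_false_positive(visitor: Record, hit: Record, photo_result: str) -> bool:
    """Apply steps i-iv: compare photos first, identifying details as fallback."""
    if photo_result == "match":      # step i: plausible match; escalate
        return False
    if photo_result == "no_match":   # step iv: clearly not the same person
        return True
    # step ii: picture unclear -- fall back to DOB and other identifying info
    return not (visitor.date_of_birth == hit.date_of_birth
                and visitor.height == hit.height
                and visitor.eye_color == hit.eye_color)
```

By that protocol, a mismatched photo alone -- never mind the mismatched name and date of birth -- should have cleared Mitchell at step iv.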
Even if you move past the glaring mismatch in photos (the photos returned in the Sentinel's search of Raptor's system are embedded in the article), neither the school nor Raptor can explain how Raptor's system returned results that can't be duplicated by journalists.
Mitchell said he was adopted, and his birth name is Lawrence Michael Evans. The Sentinel did not find a match with that or his legal name and date of birth in the national sex offender registry.
Raptor says its system is reliable, stating it returned only one false positive in that county last year. (And now the number has doubled!) That's heartening, but that number will only increase as deployment expands. Raptor's self-assessment may be accurate, but statements about the certainty of its search results are hardly useful.
The company's sales pitch likely includes its low false positive rate, which, in turn, leads school personnel to believe the system rather than the person standing in front of them -- one who bears no resemblance (physical or otherwise) to the registry search results. Mitchell still isn't allowed into the building without a security escort and is hoping that presenting school admins with his spotless criminal background check will finally jostle their apparently unshakeable belief in Raptor's search results.
This failure is also an indictment of security-over-sanity thinking. The Sentinel asked government officials whether there had been any incidents in which sex offenders gained access to schools, thus necessitating this $100,000+ investment in Raptor's security system. No results were returned.
Neither local school or state public safety or education officials could point to data showing how many registered offenders try to seek access to schools, or if a registered offender visiting a school has ever harmed a student in Aurora or Colorado.
Given this history, Raptor's system is always going to be better known -- at least at this school -- for locking out non-criminals than for catching sex offenders trying to be somewhere they shouldn't. If the schools haven't seen activity that necessitates the use of this system, it will always produce more false positives than actual hits. When there's no one to catch, you're only going to end up stigmatizing innocent parents. It's a lot of money to pay for solving a problem that doesn't exist. The school has purchased a tiger-proof rock and somehow managed to hurt someone with it.
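To put that base-rate problem in numbers: if essentially no registered offenders ever try to enter the school, every flag the system raises is a false positive, no matter how accurate the matcher is. A back-of-the-envelope sketch, where the visitor volume and error rate are illustrative assumptions, not figures from Raptor or the district:

```python
# Illustrative base-rate arithmetic; all numbers below are assumptions.
visitors_per_year = 50_000     # hypothetical district-wide visitor volume
offender_visits = 0            # the Sentinel found no documented incidents
false_positive_rate = 0.0001   # even a generous 1-in-10,000 error rate

expected_false_flags = visitors_per_year * false_positive_rate
expected_true_flags = offender_visits

print(f"Expected false flags per year: {expected_false_flags:.0f}")  # ~5
print(f"Expected true flags per year: {expected_true_flags}")        # 0
# With zero real offender visits, 100% of the system's flags are false.
```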
Filed Under: ai, aurora hills middle school, false positive, larry mitchell, schools, sex offenders
Companies: raptor