New York Schools Putting Students In The Crosshairs Of Tech That Targets Minorities, Thinks Broom Handles Are Guns

from the if-at-first-you-don't-succeed,-victimize-minors-again dept

We're turning discipline of school kids over to cops and their tech, and it's only making existing problems worse. We've seen the problems inherent in facial recognition tech. And it's not just us -- this so-called leftist rag (according to our anonymous critics) -- saying so. It's also the National Institute of Standards and Technology (NIST). Its study of 189 facial recognition algorithms uncovered why most legislators seem unworried about the surveillance creep:

Middle-aged white men generally benefited from the highest accuracy rates.

When systems pose no risk to you personally, it's unlikely you'll object to rollouts of unproven AI and questionable tech. If it only hurts people who aren't you or your voter base, any incremental increase in law enforcement "effectiveness" is viewed as an acceptable tradeoff.

Destroying the lives of minorities has never been a major legislative concern. But if we all agree children are our future, it seems insanely destructive to turn a blind eye to the havoc this tech can create. Unless, of course, legislators believe only white children can secure the future (give or take 14 words). Then it's OK, even when it definitely isn't.

Documents obtained by Motherboard show few people care about minorities, no matter what government position they hold. Vetting contractors should be the first check against abuses. But it appears no one involved with regulating the lives of students (who are legally obligated to attend schools that view them as criminals) cares what happens to the minors they subject to the racist tendencies of law enforcement agencies and the tech they deploy.

Ever since they learned that Lockport City School District intended to install a network of facial recognition cameras in its buildings, parents in the upstate New York community—particularly families of color—have worried that the new system will lead to tragic and potentially fatal interactions between their children and police.

Now, documents newly obtained by Motherboard accentuate those fears. They show that SN Technologies, the Canadian company contracted to install Lockport’s facial recognition system, misled the district about the accuracy of the algorithm it uses and downplayed how often it misidentifies Black faces. The records, comprising hundreds of pages of emails between the district and the company, also detail numerous technical issues with SN Technologies’ AEGIS face and weapons detection system, including its propensity for misidentifying objects like broom handles as guns.

Wonderful. The collective shrug of legislators is feeding kids to racist tech with a proven track record of being unable to identify criminal suspects. This one goes a step further: it can't detect weapons accurately either, which is probably why cops think it works great. Cops can't seem to tell a cellphone or a Wii controller from a gun, so whatever justifies the use of force is an acceptable tradeoff for… um… not deploying force, I guess. So, when lives are actually on the line, cops will be chasing down broom handles held by minority students, rather than weapons held by white students, who are far more likely to engage in school shootings.

The New York State Education Department (NYSED) stands by its approval of this questionable tech… sort of. Lockport officials have refused to comment. So has the police department making use of it. And so has their chosen facial recognition vendor, SN Technologies, which provides the AEGIS tech.

It's not like they didn't have any warning that the tech was faulty. Lockport officials received an email discussing AEGIS's accuracy and its propensity for aggravating racial biases. The algorithm placed 49th out of 139 in the NIST's test for racial bias. But even that weak finish was overstated. As the NIST pointed out, the algorithm submitted by SN Technologies (which licenses its algorithm from French firm id3 Technologies) wasn't the same one being deployed in New York schools.

[A]ccording to Patrick Grother, the NIST scientist who oversaw the testing, the agency never tested an id3 Technologies algorithm that matches the description Flynn gave Lockport officials. “Those numbers don’t tally with our numbers. They’re not even close. What id3 sent to NIST is not what these people are talking about,” Grother told Motherboard.

The documents obtained by Motherboard show something even more nefarious than the submission of an algorithm that didn't actually represent what the company sold to clients. It appears SN Technologies lied to school officials about the NIST's test results, claiming the algorithm was nearly twice as accurate as NIST testing actually showed.

But that hasn't stopped the rollout of facial recognition tech that disproportionately misidentifies minorities and/or their non-weapons. Other schools -- some in other states -- seem to believe faulty tech is better than no tech at all, especially if there's a chance the next false positive could prevent a school brooming.

At least 11 other districts in the state have since applied for Smart Schools money to purchase facial recognition systems, according to a NYCLU analysis of the applications. Schools in other states, such as South Carolina, have also deployed similar systems which claim the ability to detect weapons and stop school shootings.

We'll see if the spread of terrible tech slows in the future. Facial recognition is currently the target of lawsuits and legislation in New York. But if past performance is any indicator of future results, the tech isn't going to go away, no matter how poorly facial recognition tech, you know, recognizes faces.


Filed Under: facial recognition, new york, schools, surveillance


Reader Comments



    That One Guy (profile), 3 Dec 2020 @ 9:48am

    Makes perfect sense

    What could possibly reduce school shootings more than taking a class of people who are already discriminated against and piling on even more discrimination and harassment, treating them like criminals and making clear that those in authority not only don't see a problem with that but will actively encourage it?


    Annonymouse, 3 Dec 2020 @ 10:16am

    The company name being similar to that of a rather infamous Quebec engineering firm piqued my curiosity. A quick search showed that it's less than a dozen people in Gananoque, Ontario -- which, as far as I've been able to surmise, is where Quebec companies go to avoid being sussed out as Quebec companies. The big giveaway was reselling French software, a resume full of government contracts, and a period of market valuation of pennies on the dollar.

    The real tech hubs are the cities North of Toronto and the Guelph-Waterloo region.


    Kitsune106, 3 Dec 2020 @ 10:31am

    Hnnn

    If it's good enough for schools, it should also be used for cops and the like.


    Anonymous Coward, 3 Dec 2020 @ 3:11pm

    Destroying the lives of minorities has never been a major legislative concern.

    I don't know, it generally seems pretty high priority for most legislation.

    Oh, you meant a concern about not doing that, taking "concern" in what should be a positive context, which one imagines is obvious to... maybe half the population. Maybe.


