Texas School District's Facial Recognition System Capable Of Capturing A Single Student's Image More Than 1,000 Times A Week
from the Panopticon-ISD dept
Facial recognition tech is making its way into schools, subjecting minors to the same tech that still hasn't proven its worth in the adult world. Like many other surveillance encroachments, this acquisition and deployment was prompted by violence and fear.
Alfred Ng of The Markup has obtained documents detailing a system in use in some Texas schools, one acquired as a potential answer to a uniquely American existential threat.
The school district originally purchased AnyVision after a mass shooting in 2018, with hopes that the technology would prevent another tragedy. By January 2020, the school district had uploaded 2,967 photos of students for AnyVision’s database.
This seemingly small number only covers a few months of use by the Santa Fe (Texas) Independent School District. Prior to its official rollout, the district ran a short test using an even larger set of photos.
With more than 5,000 student photos uploaded for the test run, AnyVision called the results “impressive” and expressed excitement at the results to school administrators.
“Overall, we had over 164,000 detections the last 7 days running the pilot. We were able to detect students on multiple cameras and even detected one student 1100 times!” Taylor May, then a regional sales manager for AnyVision, said in an email to the school’s administrators.
It seems like the perfect tool to track students individually as they go about their school day. The 1,100 hits on a single student are presented as a sign of the software's effectiveness, something administrators seem unhealthily enthusiastic about.
And that's not all the district did with its new tech. It also lent it to local law enforcement officers who were trying to identify a suspected drug dealer they believed was a student. The school's cops contacted AnyVision, which uploaded the provided photos and ran them against images captured by its system.
The company says the software is watchlist-based, limiting users to targeting persons of interest. Private retailers use it to keep shoplifters and other criminals out of stores. Schools can target suspected sex offenders by using publicly available mugshots from law enforcement databases.
But this trial run suggests the district isn't interested in limiting itself to watchlists. The user guide [PDF] obtained by The Markup makes it clear the software can log all faces that pass by equipped cameras. It's this kind of broad collection that enables more than 100,000 "detections" in a single week and allows students to be pinpointed 1,000 times during a five-day school week.
There are controls available in the software, but it's not clear the district is using any of them.
The software offers a “Privacy Mode” feature in which it ignores all faces not on a watchlist, while another feature called “GDPR Mode” blurs non-watchlist faces on video playback and downloads. The Santa Fe Independent School District didn’t respond to a request for comment, including on whether it enabled the Privacy Mode feature.
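To make the difference between those two configurations concrete, here is a minimal sketch in Python. This is not AnyVision's code or API; the Detection record, its fields, and the visible_detections function are invented purely for illustration. In a watchlist-only "privacy" mode, any face that doesn't match the list is simply dropped; with the mode off, every face that passes a camera gets logged.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Detection:
        person_id: Optional[str]  # None = the face matched no one on the watchlist
        camera: str
        timestamp: float

    def visible_detections(detections, watchlist, privacy_mode=True):
        # Watchlist-only mode: ignore anyone who isn't a listed person of interest.
        if privacy_mode:
            return [d for d in detections if d.person_id in watchlist]
        # Mode off: log every face that passes a camera.
        return list(detections)

It's the second branch, not the first, that produces 164,000 logged "detections" in a single week.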
As for AnyVision, it says hey, it's just the provider.
“We do not activate these modes by default but we do educate our customers about them,” AnyVision’s chief marketing officer, Dean Nicolls, said in an email.
This lack of clarity or comment suggests the district is using the tech for all-purpose surveillance rather than utilizing it in the targeted fashion idealized in public statements and marketing brochures. The public statements by the district's board claim there's no invasion of privacy here, just another layer of protection. While the first part of the statement may be technically true given the lack of an expectation of privacy on school grounds, the latter is much more difficult to quantify as a justifiable offset for always-on surveillance.
Filed Under: anyvision, facial recognition, schools, surveillance, texas
Reader Comments
A surveillance system can at best deter criminal actions, but when that fails it can only provide the data to work out what happened after the event. When someone remote to the scene sees a student entering a classroom with a gun, the best they can do is call the police and ambulances to deal with the tragedy that is about to unfold.
[ link to this | view in chronology ]
Re:
I'm still confused - the system IDs only students. If there isn't a specific "identify" list, how is it supposed to alert on the legitimate student with a weapon - or any non-student?
Just how does tracking every movement prevent another shooting?
[ link to this | view in chronology ]
Re: Re:
It's not supposed to. The best you could hope for is that the average edgelord is put off by the appearance of security. Actually determined students are not going to be functionally deterred. As for agents from outside the school, you can forget it.
[ link to this | view in chronology ]
Re:
The problem is, that "someone remote to the scene" is not going to see the gun in the student's hand in time to do anything in real time, because it is an event that happens so seldom. A gunshot detection system, without the facial recognition system, would be faster and more reliable for the purpose of sending a warning when a gun is fired. ... assuming the gunshot itself wasn't sufficient warning.
And a gunshot detection system isn't going to send the school into lockdown when Rando the Bearded Stranger comes to the school to fix the plumbing.
[ link to this | view in chronology ]
I think it's pretty morally reprehensible for the administration to be openly including minors in their involvement with the latest sex fetish.
[ link to this | view in chronology ]
Putting innocent students into law enforcement databases for the rest of their lives, because freedom! Freedom of the makers of broken, intrusive technology to profit from the public sector and kids.
[ link to this | view in chronology ]
I know way back in the day there was superstition that photographs could steal people's souls. But it's kind of surprising to see school administration subscribing to such things. (A surveillance system is, by definition, passive, in that it isn't detaining people, so by itself it can't stop anything except possibly the loss of the students' likenesses.)
Also, this seems ripe for abuse. Like learning habits about students that make them easier to exploit. Of course we all know school administration and teachers are above reproach and would never do anything to harm students. /s
[ link to this | view in chronology ]
Re:
It's an abuser's wet dream if they are a staff member.
Learn where the dead zones are, set up an alert to monitor that pretty cheerleader. Use the data gathered over the space of a few weeks to determine when and which dead zones she frequents, who is usually with her, and when she is alone.
It's an automated stalking tool if it doesn't have the proper oversight and control baked into it.
[ link to this | view in chronology ]
I see Cory Doctorow's Shitty Technology Adoption Curve is back with a vengeance.
[ link to this | view in chronology ]
Re:
Didn't he put this very thing in Little Brother?
[ link to this | view in chronology ]
The technology fixes anything myth
The stated reason:
"The school district originally purchased AnyVision after a mass shooting in 2018, with hopes that the technology would prevent another tragedy. "
mission creep: ID sex offenders.
mission creep: Police given photo.
CYA - Vendor: As for AnyVision, it says hey, it's just the provider.
CYA - Vendor: watchlist-based, limiting users to targeting persons of interest.
Well, the new nanny is "The Machine" as seen on TV in Person of Interest (CBS): an agency using an all-seeing machine to catch bad guys before car bombs, nerve gas, etc. kill hundreds of people. Works like a champ in the fun world of science fiction.
School administrators are making a leap of faith that AnyVision can stop the NextBadShooting. Are they going to design a method that says -- Student 42 is troubled and has a 90% chance of doing something bad?
Other students who hinted something was up were ignored because -- teenagers are unpredictable, clannish, and getting adults involved in their games never works out, for good or bad. (Sorry, "Breakfast Club")
At this point there is only one reason AnyVision is giving this school the wink: a real-life test facility. The school is going to be embarrassed when the powers that be say get rid of it, don't commit to purchasing anything, and a judge rules against facial recognition systems (some opinion about the Constitution, and 2,000 words on doing their job).
[ link to this | view in chronology ]
Re: The technology fixes anything myth
"... Student 42 is troubled and has 90% chance of doing something bad ..."
I believe the Pasco County Sheriff in Florida can help them with that part...
[ link to this | view in chronology ]
Re: Re: The technology fixes anything myth
"... Student 42 is troubled and has 90% chance of doing something bad ..."
I believe the Pasco County Sheriff in Florida can help them with that part...
You mean the Pasco County Sheriff Recruiting Department, on the lookout for their next batch of inductees?
[ link to this | view in chronology ]
This makes no sense though...
How is this system supposed to stop a school shooting when the shooter is a student without any indication they were a threat?
[ link to this | view in chronology ]
Re: This makes no sense though...
Shhhhh! Because shut up, that's how.
[ link to this | view in chronology ]
Re: This makes no sense -- except as astro-turfing!
Mariusz: Jul 23rd, 2021 https://www.techdirt.com/user/mariusz803 -- new "account" among first 8 comments with a one-liner...
[ link to this | view in chronology ]
Re: Re: This makes no sense -- except as astro-turfing!
Been reading for a while, and yes I just recently made an account. What's your point?
[ link to this | view in chronology ]
Privacy mode blah blah limited blah blah watchlist.
And how, pray tell, does one make it onto the "watchlist"? Does anyone else (the target, parents, helpful non-authoritarian people) get to know? Do we do anything about whatever traits or indicators lead one to be placed upon a watchlist, other than watch them?
[ link to this | view in chronology ]
Re:
All students are on the watchlist. At all times. The sole criterion is to be a student as all students are inherently dangerous.
In order to be fair, going forward, all staff members, and eventually everyone in the world (except me and those I trust (max 2 at any time)), will be on the list.
[ link to this | view in chronology ]
Re: Re:
lol. That would mean a max of 3 people would be blurred out at any one time... which I think would be the opposite of privacy. You need more than half the total population off the watchlist for it to provide decent privacy.
Of course the whole idea is based on insanity, so I doubt minor details like that would deter anyone.
[ link to this | view in chronology ]
Re: Re: Re:
The important point about not being on the watchlist is that the software doesn't then identify and track you; you are essentially invisible to the system.
You don't think I'm going to allow anyone I don't trust to watch the video - this is all about automation of tracking and punishment. Oh, did I say that punishment bit out loud?
[ link to this | view in chronology ]
Funny how they aren't talking about how many teachers or administrators they managed to capture...
[ link to this | view in chronology ]
detected one student 1100 times
They say in the last 7 days, but assuming the student is at school for the standard 5 days a week, somewhere between 6 and 8 hours a day, that's:
220 detections per day
27-37 detections per hour
1 detection approx every 2 minutes
Would this camera be in their classroom (assuming the student studies in the same room all day, which isn't always the case)?
It's like a street camera detecting a parked car 50 times a second.
"This camera system is so advanced it can detect a car 4,320,000 times a day"
Of course, sufficient cameras could detect a moving student, and once the "target" is acquired, it's much easier to track incremental movement. Imagine how this could be used to ensure "segregation" by making sure a certain student isn't interacting outside of their social status.
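For what it's worth, a quick Python sketch (using the same assumptions as above: 5 school days and 6 to 8 hours per day; none of these numbers come from AnyVision beyond the quoted 1,100 detections) reproduces those figures:

    detections = 1100
    school_days = 5

    per_day = detections / school_days      # 220 detections per day
    per_hour_lo = per_day / 8               # ~27.5 per hour on an 8-hour day
    per_hour_hi = per_day / 6               # ~36.7 per hour on a 6-hour day
    minutes_between = 60 / per_hour_lo      # ~2.2 minutes between detections, at most

    # The parked-car comparison: 50 re-detections per second, around the clock
    car_per_day = 50 * 60 * 60 * 24         # 4,320,000 "detections" per day

    print(per_day, per_hour_lo, per_hour_hi, round(minutes_between, 1), car_per_day)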
[ link to this | view in chronology ]
A camera on the wall
We don't need constant surveillance
We don't need watchlist control
No wall mount cameras in the classroom
Please turn on Privacy Mode
Hey! Texas! Leave them kids alone!
AnyVision's got a camera on the wall.
Anyvision detects what you do in the hall.
[ link to this | view in chronology ]
Anyone who didn't see this coming a mile away hasn't been paying attention. Minors, especially teenagers, are seen as subhuman, and no degree of surveillance, control, or punishment is seen as 'too much.'
This system will be used to follow all possible targets at once, movement maps will be built up, a machine-learning system will be used to attempt to 'anticipate misbehavior', and kids will be harassed pre-emptively because the black box system says they're going to do something bad. It will be a self-fulfilling prophecy, just like cameras and ID badges and existing "security" technology. They will never evaluate it for effectiveness. They will never measure its performance. They will never notice that shootings double in the schools that deploy it. (How often do you hear about the fact that all school shootings since and including Columbine have happened at schools that already HAVE cameras?)
The mentality is "better safe than sorry" with a healthy dose of not even thinking about the possibility that well-intentioned actions will directly make the school a more student-hostile environment that will push kids over the edge. When a kid decides to annihilate themselves and take out as many as they can on their way down, it's because they feel trapped and see no other way out. The solution is not 'uh oh, it looks like some kids in this state find some other way to deal, let's shut those down too'. But damn does it sell some equipment and software and services. The more times the oppression causes people to crack, the more parents demand expanded oppression.
[ link to this | view in chronology ]
Notably absent is the part where they go through the 164,000 "detections" to see how many were false.
Detecting one student more than everyone else suggests an outlier caused by false positives rather than effectiveness.
[ link to this | view in chronology ]