ACLU Sues DOJ Over Facial Recognition Documents
from the catching-some-heat-from-backburnered-requesters dept
It's no secret the federal government is using facial recognition tech. The DHS wants to use it at all ports of entry (including airports) on pretty much every traveler. Amazon wants every government agency possible to buy its version of the tech, even as both the company and the agencies it hopes to supply undergo Congressional investigations. And the FBI's facial recognition database has been growing steadily since 2014, outpacing required Privacy Impact Assessments and the FBI's willingness to vet the accuracy of its search tools.
The public would definitely like to know more about the government's use of biometric tracking, but the government's way less interested in talking about it. The ACLU filed a FOIA request in January seeking biometric/facial recognition documents held by the FBI and DEA. Those requests have been ignored for 10 months.
The ACLU is now suing these federal agencies. The feds' deafening silence echoes against a backdrop of enacted facial recognition bans in a handful of cities and one statewide ban on use of the tech in police body cameras. The lawsuit [PDF] points out both agencies refused to give the ACLU's request expedited processing and the DEA went so far as to grant itself a 10-day extension to respond. That 10-day period stretched into 60 days before the DEA sent its second response -- one that stated none-too-believably that the ACLU's request was "being handled as expeditiously as possible."
The complaint asks a judge to order the immediate release of responsive records, enjoin the agencies from charging the ACLU processing fees, and award attorney fees in the event the ACLU wins its suit.
The attached FOIA request shows how much information is already in the public domain, which will make it very difficult for the feds to claim they don't have responsive documents. Facial recognition is the government's new kudzu. It's everywhere and it just keeps growing.
The FBI… operates the Next Generation Identification-Interstate Photo System, which a 2016 Government Accountability Office report described as “a face recognition service that allows law enforcement agencies to search a database of over 30 million photos to support criminal investigations.”
[...]
Amazon Web Services (AWS) provides cloud services for all 17 United States intelligence agencies, including the DOJ and its component agencies the FBI and DEA. According to recent media reporting, the FBI is testing Amazon’s Rekognition face recognition product, which is part of the suite of software products available on AWS, in a pilot program. In May 2018, the intelligence community awarded Microsoft a contract enabling all 17 agencies to use Azure Government, Microsoft’s cloud service for public entities. Microsoft, like Amazon, offers its customers a face surveillance product that runs on its own cloud service; Microsoft’s face surveillance product is called Face API. Additionally, as of 2015, the FBI has utilized NEC Corporation of America’s “Integra ID 5 biometric solution software,” which provides facial recognition capabilities, in conjunction with the agency’s NGI system.
Hopefully, the ACLU will succeed. Public records requests are pretty much the only way the public can access details about the government's surveillance plans and tech tools. The FBI and DEA aren't willing to discuss these openly but they're more than willing to deploy them against the same public these agencies insist have no business asking what the government's up to.
Filed Under: doj, facial recognition, foia
Companies: aclu
Reader Comments
The major airport here handles about 130K passengers per day. At 99.9% accuracy, a facial recognition system would make 130 errors each and every day, almost all of them false positives. And that's assuming 99.9% accuracy, something current facial recognition systems can only dream of. In realistic tests, they are more likely to have an accuracy of around 80%, or 26,000 mistakes per day.
I give the system about two days before the people running it start to ignore anything it flags unless they happen to want an excuse to stop and grope a particular individual.
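A back-of-the-envelope sketch of that arithmetic (a minimal Python example; the 130K daily passenger figure and both accuracy figures are assumptions taken from the comment above, not measurements of any real system):

```python
# Rough estimate of daily misidentifications at a busy airport,
# using the assumed figures from the comment above.
def expected_daily_errors(passengers_per_day: int, accuracy: float) -> int:
    """Expected number of passengers misidentified per day."""
    return round(passengers_per_day * (1 - accuracy))

PASSENGERS_PER_DAY = 130_000  # assumed daily throughput

print(expected_daily_errors(PASSENGERS_PER_DAY, 0.999))  # 130 errors at 99.9% accuracy
print(expected_daily_errors(PASSENGERS_PER_DAY, 0.80))   # 26000 errors at ~80% accuracy
```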
[ link to this | view in chronology ]
Re:
So they will evolve into police?
[ link to this | view in chronology ]
Re:
And since you can't know how many are false negatives, the accuracy rate could be worse than 99.9% and you'd never know.
[ link to this | view in chronology ]
Re:
In other words good-looking women with good curves trying to board a plane will end up a disproportionate number in the TSA's frisking statistics.
False positives aren't rocket science so why ANYONE would put any hope at all in a system known to be more harmful than beneficial is beyond me.
[ link to this | view in chronology ]
Re: Re:
"In other words good-looking women with good curves trying to board a plane will end up a disproportionate number in the TSA's frisking statistics."
Sexist remark.
Not everyone thinks like you. Ask Stephen.
[ link to this | view in chronology ]
I got Groucho Marx glasses here!
2 for $10
[ link to this | view in chronology ]
So What ?
... if the ACLU successfully extracted this information from Federal agencies -- so what!
We already know our Federal government is heavily engaged in mass surveillance -- but we citizens have been totally unable to even slow down its steady expansion.
We need a SOLUTION to the grave problem of government mass surveillance -- not more tedious legal filings seeking to marginally define the problem of mass surveillance.
[ link to this | view in chronology ]
Re: So yes
So in addition to legal filings, we can write articles about the problem that shed light on the abuse of power. And let our elected officials know that we don't approve.
Like Tim is doing.
[ link to this | view in chronology ]
Re: Re: So yes
"we can write articles about the problem"
... such 'article-writing' has been going on for 25 years -- with no effect on the growth of mass surveillance. It's a futile gesture.
Your Congressman & U.S. Senators have done nothing to restrict mass surveillance... and are quite content with the status quo.
[ link to this | view in chronology ]
Re: Re: Re: So yes
Well you can help, you can complain, or you can start digging a bunker.
[ link to this | view in chronology ]
Tracking people everywhere they go will result in those records being the subject of a subpoena. With that in mind, will each picture of your face be saved in the records for verification, or will it be deleted and replaced with a "Waldo Was Here" flag?
[ link to this | view in chronology ]
Re:
Passively tracking and recording everyone using facial recognition technology would be a massive waste of time and resources.
The facial recognition just marks positive hits for further investigation and would only need to indicate a time stamp in a log. It doesn't have to serve as "proof" that someone was there or who they are in the sense that you are thinking.
[ link to this | view in chronology ]
Re: Re:
"positive hits for further investigation"
and this investigation would of course include verification that the image taken is indeed the person you think it is. I imagine that holding the image up next to the person you think it represents would be something an investigator would be doing.
[ link to this | view in chronology ]
What are the objections to using facial recognition at international airports? Travelers don't have a right to conceal their identities, so why does it matter if their faces are scanned?
[ link to this | view in chronology ]
Re:
There's a fuzzy line between a legal amount and an illegal amount of surveillance. At a certain point the tracking requires a specific warrant for every person being monitored.
[ link to this | view in chronology ]
Re: Re:
But they're not talking about tracking people. It is specifically to scan the faces of people attempting entry into the United States, not the general public. It even says in the article they want to use it on "travelers". There is no right to conceal your identity while you're attempting entry into a country via international airport. There's no reasonable privacy expectation.
Sure there will be false positives and errors, just as there have been with every new investigatory or interrogation technique ever devised. That doesn't mean we should stop trying new things.
[ link to this | view in chronology ]
Re: Re: Re:
One of the ACLU's big causes is the opposition to overlapping but individual identity checks/scans, etc., forming a giant surveillance web and tracking most of people's lives.
That kind of tracking can be inferred to have been found unconstitutional in a recent Supreme Court decision involving tracking a car.
[ link to this | view in chronology ]
Re: Re: Re:
"But they're not talking about tracking people"
"Sure there will be false positives and errors"
"just as there have been with every new investigatory or interrogation technique ever devised."
"That doesn't mean we should stop trying new things."
[ link to this | view in chronology ]
Re: Re: Re: Re:
What does any of what you said have to do with facial recognition being used for identity verification at customs entry points in international airports?
"They're not doing bad stuff with it... yet" isn't a compelling argument when Trump uses it, it's not any more compelling coming from you.
[ link to this | view in chronology ]
Re: Re: Re: Re: Re:
Ummm - how about the fact that it does not work. Why deploy a system that is well known for its false positives? Do they not care?
How many false positives will result in the pursuit of innocent people for crimes they had nothing to do with? How much will it cost these innocent people to defend themselves?
Just like the so called "private" DNA databases are now being used for purposes not originally agreed to, what makes you think this situation is any different? Get real.
[ link to this | view in chronology ]
Re: Re: Re: Re: Re: Re:
You're acting like being flagged as a false positive will somehow result in people being sent immediately to prison for crimes they didn't commit. It's just one point of data to determine your identity and whether you are a person of interest, no different from or more invasive than being asked to present a passport.
Multiple scans can eliminate many false positives already, and the tech is getting better. Rolling it out in a limited but highly-visible fashion, in a place where it makes perfect sense to use it, is exactly how they should be trying to implement new security technology.
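For illustration only, here is a minimal sketch of that claim under an idealized assumption: each scan is an independent check with the same false positive rate, and a traveler is only flagged when every scan agrees. Repeated scans of the same face are correlated in practice, so the real-world improvement would be smaller than this suggests.

```python
# Combined false positive rate when a traveler is flagged only if
# all of k scans flag them, assuming the scans are independent
# (an idealization; real scans of the same face are correlated).
def combined_false_positive_rate(per_scan_fp_rate: float, scans: int) -> float:
    return per_scan_fp_rate ** scans

print(combined_false_positive_rate(0.01, 1))  # 0.01   -> 1% of innocent travelers flagged
print(combined_false_positive_rate(0.01, 3))  # ~1e-06 -> roughly one in a million
```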
[ link to this | view in chronology ]
Re: Re: Re: Re: Re: Re: Re:
You could be pulled out of the boarding queue and held for hours as they argue you are using false papers, especially as the TSA hires people who are likely to believe the computer over the person in front of them, particularly if that person is not white.
Alternatively, you are mistakenly identified as a target of a man hunt, and shot before they verify your identity.
[ link to this | view in chronology ]
Re: Re: Re: Re: Re: Re: Re: Re:
So things will be exactly the same as they are already? And this tech has exactly the same problems as all other forms of security, so demonizing it as inherently evil without even seeing how it will be used is pretty stupid huh?
[ link to this | view in chronology ]
Re: Re: Re: Re: Re: Re: Re: Re: Re:
"So things will be exactly the same as they are already?"
No, currently a mistake or suspicion by the TSA means some inconvenience.
A false positive from a facial recognition algorithm, otoh, means some perfectly innocent people will be pointed out as terrorists likely to carry explosives who need to be shot on sight for the safety of the surrounding people.
Tech which is guaranteed to cause great harm and no good should not be rolled out. Spending the money on more trained security experts would be the better option by far.
[ link to this | view in chronology ]
Re: Re: Re: Re: Re: Re: Re:
You sound like a sales person, do you have a financial stake?
[ link to this | view in chronology ]
Re: Re: Re: Re: Re: Re: Re:
"You're acting like being flagged as a false positive will somehow result in people being sent immediately to prison for crimes they didn't commit."
When the likely misidentification links an innocent person with someone who is high enough on the NSA list of "dangerous and wanted" to merit inclusion?
That black dude from Chicago over there who just got tagged as "Abu Bakr the Mad Bomber of Afghanistan" won't get sent to prison. He'll be catching bullets the very second he tries to pick up his suitcase or starts using his smartphone.
[ link to this | view in chronology ]
Re: Re: Re:
"Sure there will be false positives and errors, just as there have been with every new investigatory or interrogation technique ever devised. That doesn't mean we should stop trying new things."
Really?
You don't see an issue with hundreds of perfectly innocent travelers being flagged by a system as terrorists, child abusers, wanted murderers? In a nation which currently has a terrifying history of trigger-happy police and rent-a-cops?
When a system is guaranteed from the start to cause great harm and little to no good then that is a system you simply shouldn't invest in. Especially when for the same cost you could have a hundred more trained security officers in the airport.
[ link to this | view in chronology ]
Re: identity
do you know the detailed "identity" of everyone you see at an airport, shopping mall, or just walking down a street?
A face does not reveal identity ... unless specifically correlated with other data.
The problem is that facial recognition is part of a larger mass surveillance system that effectively treats all persons as suspects in past/current/future crimes.
We all end up being tracked by the government, most all of the time.
Government tracking of everyone is a means to control everyone ... and is a hallmark of an authoritarian government.
Does living under an authoritarian government bother you at all?
[ link to this | view in chronology ]
Re: Re: identity
Yes, the government does or should know the identity of everyone that attempts entry into the US at an international airport. We're talking about facial recognition scanning at customs entry points, a place where you are already required by law to truthfully reveal your identity. There is no expectation of anonymity when you are attempting to enter a country.
[ link to this | view in chronology ]
Re:
"What are the objections"
Given: Eventually, these records will be used in court to determine something.
Question: Will the record be archived such that the picture used in generating the facial match is preserved for verification purposes or will images be deleted after being "matched"? With the well known failure rate of this technology, it should not be deployed and certainly not used as a simple bread crumb db.
[ link to this | view in chronology ]
Re: Re:
Why would the image need to be stored or used in court? DHS doesn't need anything like probable cause to investigate someone attempting entry to the US, the attempted entry gives them all the authority they need to investigate and confirm the identity of an international traveler.
[ link to this | view in chronology ]
Re: Re: Re:
Perhaps this issue is a tad larger than a customs entry point.
afaik, facial recognition is being deployed at locations other than customs entry points - and why not?
Like I said previously, false positives are a huge problem.
[ link to this | view in chronology ]
Re: Re: Re: Re:
Yes, the issue is much larger, but I'm asking specifically about the DHS plan to use facial recognition technology at customs entry points, as pointed out by this article as though it were a bad thing.
Facial recognition technology has legitimate uses in a customs entry point, so I'm asking what is the problem with using it there, but all I am getting back is vague assertions that facial recognition tech and government are bad no matter what.
[ link to this | view in chronology ]
Re: Re: Re: Re: Re:
"I'm asking specifically about the DHS plan to use facial recognition technology at customs entry points, as pointed out by this article as though it were a bad thing."
Yes, it is a bad thing. Let me repeat myself ... it does not work.
How much is the tax payer being charged for this equipment that does not work? Why deploy something that does not work? Why base your security upon something that does not work?
I think what your question should be is ... If facial recognition worked, then why is it a bad thing. Amirite?
[ link to this | view in chronology ]
Re: Re: Re: Re: Re: Re:
Nobody is talking about "basing your security" on facial recognition. It's one single point of data, just like the passport you have to provide and the questions you have to answer.
Since your objection is that it doesn't work, would you be fine with it being used if it did work?
[ link to this | view in chronology ]
Re: Re: Re: Re: Re: Re: Re:
"Since your objection is that it doesn't work, would you be fine with it being used if it did work?"
So you agree that should be the question?
[ link to this | view in chronology ]
Re: Re: Re: Re: Re: Re: Re:
Neither the passport nor the question you mention has the ability to erroneously link your face to a "MAD BOMBER! SHOOT ON SIGHT!" security alert issued by an intelligence agency.
If TSA personnel get suspicious of you, you get a patdown, a luggage search, and a slew of further questions.
When facial recognition tech misidentifies you what you get is a bullet through your skull.
[ link to this | view in chronology ]
can someone please sensibly explain why it is that every government everywhere wants/needs this information on totally innocent citizens? if all people were criminals and/or terrorists, and had been proven as such, it would be a bit different, but when the info being scooped up is on law-abiding, tax-paying, honest-living citizens, it doesn't make much sense. it goes back 75+ years, when a world war was fought to stop a particular government in a particular country that wanted to know everything about all its citizens with the intention of annihilating a particular race and terrorizing all the rest. why have the world's governments now decided to do the very thing that was fought against so vehemently for so long, with guarantees given at the end of the conflict being thrown out the window now? it's almost as if those people in the government i'm speaking about have infiltrated every other government worldwide and the desire is manifesting itself again, but with much more viciousness because of the technology available today. whatever is going on, it's a very bad thing and i sincerely hope it can be brought to a suitable end in the very near future! wiping out millions of people so a few can get what they desire, control of everyone and everything that would be left, is not something (being a slave) i care to contemplate!
[ link to this | view in chronology ]
Re:
This is the next brick in the wall...
Every US citizen violates at least one law a day, if not 5-8... normally this isn't relevant as nobody is 'catching' everyone for minor violations (speeding, jaywalking, etc).
Add a system that films, recognizes, and identifies individuals, and the next step is to start issuing citations (for the verbal morality code...) or tracking a 'citizen score'.
Suddenly everyone who isn't one of the watchers is a criminal, charged based on what the cameras saw them do, regardless of who else was around and whether anybody was hurt.
Welcome to China, all your bases are belonging to us...
[ link to this | view in chronology ]
Although people seem to want to lump all large tech companies into the same swamp (e.g., Amazon, Google, Facebook, Twitter, etc.), I think we should be making at least some distinction based on how companies react to requests by the government for things like facial recognition and other encroachments on privacy.
In my opinion, although all clearly do this to some extent, Amazon is clearly the worst offender. If there is a buck to be made, Amazon clearly does not care from whom the money flows, nor how what they are willing to develop and sell will affect anyone. I personally will never buy anything through Amazon, regardless of any price differential. Freedom is never the cheapest solution, whether the price is monetary in nature or some other cost is imposed.
Every purchase through Amazon is support for government surveillance at its worst. Just know that if you buy through Amazon, you have no standing to complain about the infringement on privacy that Amazon is happy to support.
[ link to this | view in chronology ]
Re:
I like Amazon
[ link to this | view in chronology ]