South Korean Gov't Gave Millions Of Facial Photos Collected At Airports To Private Companies

from the PLEASE-DO-NOT-FEED-THE-ALGORITHMS dept

Facial recognition systems are becoming an expected feature in airports. Often installed under the assumption that collecting the biometric data of millions of non-terrorist travelers will prevent more terrorism, the systems are just becoming another bullet point on the list of travel inconveniences.

Rolled out by government agencies with minimal or no public input and deployed well ahead of privacy impact assessments, these systems let travelers at airports around the world know they can fly anywhere, as long as they give up a bit of their freedom.

What's not expected is that the millions of images gathered by hundreds of cameras will just be handed over to private tech companies by the government that collected them. That's what happened in South Korea, where facial images (mostly of foreign nationals) were bundled up and given to private parties without ever informing travelers this had happened (or, indeed, would be happening).

The South Korean government handed over roughly 170 million photographs showing the faces of South Korean and foreign nationals to the private sector without their consent, ostensibly for the development of an artificial intelligence (AI) system to be used for screening people entering and leaving the country, it has been learned.

The agency carelessly handing out millions of facial images to private tech companies was the country's Ministry of Justice. Ironically enough, South Korean privacy activists (as well as some of the millions contained in the database) say this action is exactly the opposite of "justice."

While the use of facial recognition technology has become common for governments across the world, advocates in South Korea are calling the practice a “human rights disaster” that is relatively unprecedented.  

“It’s unheard-of for state organizations—whose duty it is to manage and control facial recognition technology—to hand over biometric information collected for public purposes to a private-sector company for the development of technology,” six civic groups said during a press conference last week.

The project -- one with millions of unaware participants -- began in 2019. The MOJ is in the process of obtaining better facial recognition tech with which to arm its hundreds of airport cameras. To accomplish this, it apparently decided the private sector should take everything those cameras had collected so far and use the images to train facial recognition AI.

The public was never informed of this by the Ministry of Justice. It took another government official to deliver the shocking news. National Assembly member Park Joo-min requested information from the Ministry about its "Artificial Intelligence and Tracking System Construction" project and received this bombshell in return.

Maybe the government felt this was okay because most of the images were of non-citizens. This is from South Korean news agency Hankyoreh, which broke the story:

Of the facial data transferred from the MOJ for use by private companies last year as part of this project, around 120 million images were of foreign nationals.

Companies used 100 million of these for “AI learning” and another 20 million for “algorithm testing.” The MOJ possessed over 200 million photographs showing the faces of approximately 90 million foreign nationals as of 2018, meaning that over half of them were used for learning.

With more than two-thirds of the freebie images being of foreigners, perhaps the South Korean government thought it would lower its incoming litigation footprint. But that still leaves roughly 50 million images of its own citizens. And there's nothing preventing foreign citizens from suing the South Korean government, even if doing so from abroad is often considerably more expensive than suing locally.

Lawsuits are coming, though, according to Motherboard.

Shortly after the discovery, civil liberty groups announced plans to represent both foreign and domestic victims in a lawsuit.

The legal basis for the collection isn't being challenged. It's the distribution of the collected images, which no travelers expressly agreed to. Precedent isn't on the government's side.

“Internationally, it is difficult to find any precedent of actual immigration data from domestic and international travelers being provided to companies and used for AI development without any notification or consent,” said Chang Yeo-Kyung, executive director of the Institute for Digital Rights.

It's pretty sad when democratic governments decide the people belong to the government, rather than the other way around. But as the march towards always-on surveillance continues in travel hubs and major cities, using members of the public as guinea pigs for AI development will probably become just as routine as the numerous, formerly novel impositions placed on travelers shortly after the 9/11 attacks.



Filed Under: biometrics, facial recognition, privacy, south korea, surveillance


Reader Comments



  • Anonymous Coward, 23 Nov 2021 @ 11:47am

    DMV clerk: I'm sorry sir, but our AI says that you are a foreigner named Nguyen. Renewing this license would be fraud. If you could ask Mr Kim to come in, we can get his picture taken and the license renewed, no problem.

    Mr Kim: but I'm ....

    DMV clerk: The computer is never wrong. NEXT!


  • ECA (profile), 23 Nov 2021 @ 12:16pm

    And foreign laws?

    What recourse do international laws give?
    Not very much.


