China Starts Using Facial Recognition-Enabled 'Smart' Locks In Its Public Housing
from the just-wait-until-they-know-your-citizen-score-too dept
Surveillance using facial recognition is sweeping the world. That's partly for the usual reason that the underlying digital technology continues to become cheaper, more powerful and thus more cost-effective. But it's also because facial recognition can happen unobtrusively, at a distance, without people being aware of its deployment. In any case, many users of modern smartphones have been conditioned to accept it unthinkingly, because it's a quick and easy way to unlock their device. This normalization of facial recognition is potentially bad news for privacy and freedom, as this story in the South China Morning Post indicates:
Beijing is speeding up the adoption of facial recognition-enabled smart locks in its public housing programmes as part of efforts to clamp down on tenancy abuse, such as illegal subletting.
The face-scanning system is expected to cover all of Beijing's public housing projects, involving a total of 120,000 tenants, by the end of June 2019.
Although a desire to stop tenancy abuses sounds reasonable enough, it's important to put the move in a broader context. As Techdirt reported back in 2017, China is creating a system storing the facial images of every Chinese citizen, with the ability to identify any one of them in three seconds. Although the latest use of facial recognition with "smart" locks is being run by the Beijing authorities, such systems don't exist in isolation. Everything is being cross-referenced and linked together to ensure a complete picture is built up of every citizen's activities -- resulting in what is called the "citizen score" or "social credit" of an individual. China said last year that it would start banning people with "bad" citizen scores from using planes and trains for up to a year. Once the "smart" locks are in place, it would be straightforward to make them part of the social credit system and its punishments -- for example by imposing a curfew on those living at an address, or only allowing certain "approved" visitors.
Even without using "smart" locks in this more extreme way, the facial recognition system could record everyone who came visiting, and how long they stayed, and transmit that data to a central monitoring station. The scope for abuse by the authorities is wide. If nothing else, it's a further reminder that if you are not living in China, where you may not have a choice, installing "smart" Internet of things devices voluntarily may not be that smart.
Follow me @glynmoody on Twitter or identi.ca, and +glynmoody on Google+
Filed Under: china, facial recognition, privacy, public housing, smart locks, surveillance
Reader Comments
Seems bad all around, and doesn't accomplish the goal either.
Then there's the idea of curfews: if I get home late, do I have to spend the night on the street? Or I invite some acquaintance over, not knowing their 'social score', and find they can't get in? Or worse, I meet them at the door and let a 'socially unacceptable' person in, not knowing (or maybe knowing) they are 'socially unacceptable' (probably meaning anti-government-control, but they could be actually bad)?
So this is just another step in the government taking control over the population, at least until the population decides it doesn't want that control anymore. There are an awful lot of them.
Re: Seems bad all around, and doesn't accomplish the goal either
If you're a citizen with a good social credit score there is next to no reason to doubt the system. In fact many "model citizens" operate under the indoctrinated belief that if you're considered morally upright, you have nothing to fear. Never mind that failing the system is as easy as doing something in the grey area such as exposing corruption. And the system is inherently designed to make comebacks impossible.
It's out_of_the_blue's wet dream.
That'll learn 'em
All I want is to get in my F---g apartment, take some aspirin and antihistamine, put an icepack on, and sleep for a week.
And now the F---g door won't let me in.
Great.
The capabilities of this system do not stop at the doorway to the homes in question; they extend out into the apartment hallways and roads the doors face, too.
Look on the bright side!
There was a story (that Techdirt reported on, I believe) where facial recognition thought that 36 then-current members of Congress (out of 435) were matches with criminals already sitting behind bars. That's an 8% false-match rate, which is scary high for a lock on your door.
Re: Criminals "Behind Bars"
Which raises the question of what the correct % to flag would be (and what was in the mugshot collection), because it seems unlikely to me that no one in the whole of Congress has ever been arrested and booked.
Also, a lock is a different kind of error than the Congress example (false positive vs. false negative), which may have a different error rate entirely.
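The distinction the commenter is drawing can be made concrete with a toy calculation. All numbers below except the 36-of-435 figure from the comment above are hypothetical, made up purely for illustration:

```python
# Toy illustration of why a watchlist scan and a door lock care about
# different error rates. Only the 36/435 figure comes from the comment;
# the lock numbers are hypothetical.

def false_positive_rate(false_matches, non_matching_probes):
    """Fraction of people who should be rejected but are accepted.
    For a door lock, this is the security failure: a stranger gets in."""
    return false_matches / non_matching_probes

def false_negative_rate(missed_matches, matching_probes):
    """Fraction of people who should be accepted but are rejected.
    For a door lock, this is the everyday failure: the tenant is locked out."""
    return missed_matches / matching_probes

# The Congress/mugshot test was a watchlist scan: flagging 36 of 435
# non-criminal faces as "matches" is a false-positive problem.
congress_fpr = false_positive_rate(36, 435)
print(f"Congress test false-positive rate: {congress_fpr:.1%}")  # ~8.3%

# A lock mostly sees its own tenant, so what the tenant experiences
# day to day is the false-negative rate (hypothetical: 2 lockouts
# in 100 entry attempts).
lock_fnr = false_negative_rate(2, 100)
print(f"Hypothetical lock false-negative rate: {lock_fnr:.1%}")  # 2.0%
```

The point is that quoting a single "failure rate" for a face-recognition system is ambiguous: tuning the match threshold trades one kind of error for the other, so a system acceptable as a watchlist may be maddening as a lock, and vice versa.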
Oh, wait - I think that is the point - they want it to screw with the lives of the minions because it's fun watching them scurry around.
On a side note, I recall a story about a major city in China with a pollution problem so bad that it interfered with their city-wide facial-recognition spy system - and this was the only reason they actively tried to clean up the air.
Culturally, ethics in practice differs from what is seen in the West. I see constant effort to game the agreement or the rules of society. This plays out in our observation of deliveries of steel or other products that may not meet agreed-upon specs or may include some hidden defect. We see this when tutoring exchange students in US universities: there is open agitating for advance access to tests or other privileged information not made available by the professor, rather than a focus on the topics. We see this in the professional world, where H1B employees tend to be more likely to pad billable hours or other performance-related reports.
My expectation is that social credit turns into a game where everyone games the score at all times and methods develop to pay for improved score.
This assumption leads to interesting scenarios. Does China detect the gaming of the citizen score systems, and hold accountable those involved in the practice? Will selective enforcement become the norm?
If the system is applied honestly we'll see the frog notice the boil and an increase in social unrest. I'll be curious to observe how it plays out.