One More Time With Feeling: 'Anonymized' User Data Not Really Anonymous
from the we-can-see-you dept
As companies and governments increasingly hoover up our personal data, a common refrain used to keep people from worrying is that nothing can go wrong, because the data itself is "anonymized," or stripped of personal detail. But time and time again, we've noted that this is cold comfort, since it takes only a little effort to identify a person once you have access to other data sets. As cellular carriers in particular begin to collect every shred of browsing and location data, de-anonymizing that data with just a little additional context has become arguably trivial.
Researchers from Stanford and Princeton universities plan to make this point once again via a new study being presented at the World Wide Web Conference in Perth, Australia this upcoming April. According to this new study, browsing habits can be easily linked to social media profiles to quickly identify users. In fact, using data from roughly 400 volunteers, the researchers found that they could identify the person behind an "anonymized" data set 70% of the time just by comparing their browsing data to their social media activity:
"The programs were able to find patterns among the different groups of data and use those patterns to identify users. The researchers note that the method is not perfect, and it requires a social media feed that includes a number of links to outside sites. However, they said that "given a history with 30 links originating from Twitter, we can deduce the corresponding Twitter profile more than 50 percent of the time."
The researchers had even greater success in an experiment they ran involving 374 volunteers who submitted web browsing information. The researchers were able to identify more than 70 percent of those users by comparing their web browsing data to hundreds of millions of public social media feeds.
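To make the basic idea concrete, here is a deliberately simplified sketch of that kind of matching: score each candidate social media profile by how many links it shares with the "anonymized" browsing history, and pick the best scorer. The actual study uses a more careful statistical model (rare links count for far more than popular ones), and every handle and URL below is invented for the example.

    from collections import Counter

    def best_matching_profile(browsing_links, candidate_feeds):
        # browsing_links: set of URLs from the "anonymized" browsing history.
        # candidate_feeds: dict mapping a profile handle to the set of URLs
        # that profile has posted. Returns the handle with the most overlap.
        scores = Counter()
        for handle, feed_links in candidate_feeds.items():
            scores[handle] = len(browsing_links & feed_links)
        return scores.most_common(1)[0][0] if scores else None

    # Toy example: three hypothetical feeds, one browsing history.
    feeds = {
        "@alice": {"sitea.example/1", "siteb.example/2", "sitec.example/3"},
        "@bob":   {"sited.example/4", "sitee.example/5"},
        "@carol": {"sitea.example/1", "sitef.example/6"},
    }
    history = {"sitea.example/1", "siteb.example/2", "news.example/x"}
    print(best_matching_profile(history, feeds))  # prints "@alice"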
Of course, given the sophistication of online tracking and behavioral ad technology, this shouldn't be particularly surprising. Numerous researchers have likewise noted that it's relatively simple to build systems that identify users with just a little additional context. That, of course, raises questions about how much protection "anonymizing" data actually provides, both in everyday business practice and should this data be hacked and released into the wild:
"Yves-Alexandre de Montjoye, an assistant professor at Imperial College London, said the research shows how "easy it is to build a full-scale 'de-anonymizationer' that needs nothing more than what's available to anyone who knows how to code." "All the evidence we have seen piling up over the years showing the strong limits of data anonymization, including this study, really emphasizes the need to rethink our approach to privacy and data protection in the age of big data," said de Montjoye.
And this doesn't even factor in how new technologies -- like Verizon's manipulation of user data packets -- allow companies to build sophisticated new profiles based on a combination of browsing data, location data, and modified packet headers. The FCC's recently-passed broadband privacy rules were designed in part to acknowledge these new efforts by allowing user data collection -- but only if this data was "not reasonably linkable" to individual users. But once you realize that all data, "anonymized" or not, is linkable to individual users, such a distinction becomes wholly irrelevant.
One of the study's authors, Princeton researcher Arvind Narayanan, has been warning that anonymous data isn't really anonymous for the better part of the last decade, yet it's not entirely clear when we intend to actually hear -- and understand -- his message.
Filed Under: anonymized data, anonymous, privacy
Reader Comments
Unreal
Obama refused to crack down on these activities adequately and has handed Trump tools and weapons that no president should have. I'd say that it is up to Congress now to do their jobs, but since they work for the corporations, and not the people, that isn't going to happen.
While I think that Obama did a number of good things, I sure wish that he had had the balls to actually heed the warnings that he was given. Like LBJ (in regard to Vietnam), Obama was too worried about being called a pussy to do the right things about T.W.A.T. Now we all, Americans and the entire world, are going to be forced to deal with the consequences of his inaction.
I sure hope like hell that in 2020 Americans remember this kind of crap and put someone in the White House who isn't too cowardly or psychotic to do the right thing for the country. It's time we voted in people who KNOW that they work for the best interests of the people, not the best interests of the government, or the corporations.
Oh Well
About time
Re:
I do not think that everyone knows that tidbit; it seems like an unreasonably restrictive search criterion. Why is this the case when other identifying tags might provide equivalent results?
Re: Re:
How do we fight this?
Two possible routes (I am sure there are others):
The law. Allow for data to be marked, by the owner or source, as "anonymised" (whether any technical steps are taken or not) and make it a criminal offence to either (i) attempt to de-anonymise, or (ii) correlate such data with any other data. This should be enough to prevent (for example) insurance companies using such data to set premiums and it might even be enough to prevent major commercial data brokers from using the data (although steps would have to be taken to make sure investigation and penalties are severe enough to prevent data-washing, possibly abroad). Of course, it has no effect on governments, nor on commercial deals where the source is not willing to mark the data as "anonymised".
These two steps would also have to be accompanied by greater public awareness of de-anonymisation. The legal route is particularly important in making sure that companies cannot claim something is "anonymised" unless there are ways for the data subjects to actually enforce it.
Differential Privacy
(Not the best Wikipedia article; better to read Dwork's papers & watch her videos on YouTube.)
Given a number of queries, n, DP fuzzes the dataset in such a way that the distribution of the answers changes by at most a small, quantifiable factor whether or not any single person's record is included.
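A minimal sketch of the textbook construction (the Laplace mechanism applied to counting queries) follows, on the assumption that this is roughly the mechanism the comment was describing. The function names, toy data, and the equal split of the budget across n queries are illustrative choices, not anything from the comment or the article.

    import random

    def laplace_noise(scale):
        # The difference of two i.i.d. Exponential(rate=1/scale) draws is
        # distributed Laplace(0, scale); this avoids any extra dependency.
        return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

    def private_count(records, predicate, epsilon):
        # A counting query has sensitivity 1 (adding or removing one person's
        # record changes the true answer by at most 1), so Laplace noise with
        # scale 1/epsilon gives the epsilon-DP guarantee: for two datasets
        # differing in one record, the probability of any given noisy answer
        # differs by at most a factor of exp(epsilon).
        true_count = sum(1 for r in records if predicate(r))
        return true_count + laplace_noise(1.0 / epsilon)

    # With n queries, basic composition spends the budget additively, so one
    # simple policy is to run each query at epsilon_total / n.
    ages = [23, 35, 41, 29, 52, 61, 33]
    epsilon_total, n = 1.0, 2
    print(private_count(ages, lambda a: a >= 40, epsilon_total / n))
    print(private_count(ages, lambda a: a < 30, epsilon_total / n))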
Re: Differential Privacy
It looks like putting in a "less than" character will end the comment.
I'm not going to waste any time providing high quality feedback if your web page arbitrarily truncates the comments.
33 bits of entropy
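(In case the reference isn't familiar: the title presumably alludes to the back-of-the-envelope fact that log2(7,500,000,000) ≈ 32.8, so roughly 33 bits of identifying information are enough to single out any one person on Earth. It is also the name of a blog on de-anonymization run by study co-author Arvind Narayanan.)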
Re: