Facebook's Latest Privacy Screwup Shows How Facebook's Worst Enemy Is Still Facebook
from the get-your-act-together dept
There's another Facebook scandal story brewing today and, once again, it appears that Facebook's biggest enemy is the company itself and how it blunders into messes that were totally unnecessary. When the last story broke, we pointed out that much of the reporting was exaggerated, and people seemed to be jumping to conclusions that weren't actually warranted by some internal discussions about Facebook's business modeling. The latest big scandal, courtesy of a New York Times story, reveals that Facebook agreed to share a lot more information with a bunch of large companies than previously known or reported (though, hilariously, one of those companies is... The NY Times, a fact the paper plays down quite a bit).
The social network permitted Amazon to obtain users’ names and contact information through their friends, and it let Yahoo view streams of friends’ posts as recently as this summer, despite public statements that it had stopped that type of sharing years earlier.
As Kash Hill notes in a separate story at Gizmodo, this suddenly explains a story she had explored years ago, in which Amazon rejected a review of a book, claiming the reviewer "knew the author" (which was not true). However, the reviewer had followed the author on Facebook, and Amazon magically appeared to know about that connection even though the reviewer had never directly shared her Facebook data with Amazon.
The NY Times report further explains another bit of confusion that Hill has spent years trying to track down: how Facebook's People You May Know feature is so freaking creepy. Apparently, Facebook had data sharing agreements with other companies to peek through their data as well:
Among the revelations was that Facebook obtained data from multiple partners for a controversial friend-suggestion tool called “People You May Know.”
The feature, introduced in 2008, continues even though some Facebook users have objected to it, unsettled by its knowledge of their real-world relationships. Gizmodo and other news outlets have reported cases of the tool’s recommending friend connections between patients of the same psychiatrist, estranged family members, and a harasser and his victim.
Facebook, in turn, used contact lists from the partners, including Amazon, Yahoo and the Chinese company Huawei — which has been flagged as a security threat by American intelligence officials — to gain deeper insight into people’s relationships and suggest more connections, the records show.
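To make the mechanism the Times is describing a bit more concrete, here is a minimal sketch of how matching imported contact lists against existing accounts can surface "people you may know" connections. This is purely illustrative Python with invented data structures, not Facebook's actual (and undisclosed) PYMK system:

# Speculative sketch of contact-list matching for friend suggestions.
# The data structures and matching rule are invented for illustration;
# the real PYMK system is far more complex and has never been disclosed.
from collections import defaultdict

# Contact lists imported from partner services, keyed by the account
# that uploaded them (phone numbers normalized to strings).
partner_contact_lists = {
    "alice@example.com": {"+15551230001", "+15551230002"},
    "bob@example.com": {"+15551230001", "+15551230003"},
}

# Accounts on the social network, keyed by the phone number on file.
accounts_by_phone = {
    "+15551230001": "carol",
    "+15551230003": "dan",
}

def suggest_people_you_may_know(contact_lists, accounts):
    """Suggest accounts whose phone number appears in someone's imported contacts."""
    suggestions = defaultdict(set)
    for uploader, contacts in contact_lists.items():
        for phone in contacts:
            if phone in accounts:
                # The uploader apparently knows this person, so each gets
                # suggested to the other -- even though the other person
                # never shared anything with the partner service at all.
                suggestions[uploader].add(accounts[phone])
                suggestions[accounts[phone]].add(uploader)
    return dict(suggestions)

print(suggest_people_you_may_know(partner_contact_lists, accounts_by_phone))
# carol gets suggested to both alice and bob (and vice versa), which is
# exactly the one-sided "how do they know that?" effect Hill describes.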
As Hill noted on Twitter, when she asked Facebook last year if it uses data from "third parties such as data brokers" to figure out PYMK, Facebook's answer was technically correct, but totally misleading:
In the summer of 2017, I asked Facebook if it used signals from "third parties such as data brokers" for friend recommendations. Kicking myself for not recognizing the evasion in their answer.
1. FB's answer via email
2 What the NYT found out about PYMK in internal FB docs pic.twitter.com/Sv2kwqsJ6u
— Kashmir Hill (@kashhill) December 19, 2018
Specifically, Facebook responded: "Facebook does not use information from data brokers for People You May Know." Note that the question was if Facebook used information from "third parties" and the "data brokers" were just an example. Facebook responded that it didn't use data brokers, which appears to be correct, but left out the other third parties from which it did use data.
And this is why Facebook is, once again, its own worst enemy. It answers these kinds of questions in the same way that the US Intelligence Community answers questions about its surveillance practices: technically correct, but highly misleading. And, as such, when it comes out what the company is actually doing, the company has completely burned whatever goodwill it might have had. If the company had just been upfront, honest and transparent about what it was doing, none of this would be an issue. The fact that it chose to be sneaky and misleading about it shows that it knew its actions would upset users. And if you know what you're doing will upset users, and you're unwilling to be frank and upfront about it, that's a recipe for disaster.
And it's a recipe that Facebook keeps making again and again and again.
And that's an issue that goes right to the top. Mark Zuckerberg has done too much apologizing without actually fixing any of this.
One bit in the NY Times piece deserves particular discussion:
Facebook also allowed Spotify, Netflix and the Royal Bank of Canada to read, write and delete users’ private messages, and to see all participants on a thread — privileges that appeared to go beyond what the companies needed to integrate Facebook into their systems, the records show. Facebook acknowledged that it did not consider any of those three companies to be service providers. Spokespeople for Spotify and Netflix said those companies were unaware of the broad powers Facebook had granted them. A Royal Bank of Canada spokesman disputed that the bank had any such access.
Spotify, which could view messages of more than 70 million users a month, still offers the option to share music through Facebook Messenger. But Netflix and the Canadian bank no longer needed access to messages because they had deactivated features that incorporated it.
This particular issue has raised a lot of alarm bells. As Alvaro Bedoya points out, disclosing the content of private communications is very much illegal under the Stored Communications Act. But the NY Times reporting is not entirely clear here either. Facebook did work hard for a while to try to turn Messenger into more of a "platform" that would let you do more than just chat -- so I could see where it might "integrate" with third-party services to enable their features within Messenger. But the specifics of how that works would be (1) really, really important, and (2) should be 100% transparent to users -- such that if they're agreeing to, say, share Spotify songs via Messenger, they should absolutely be told that this means Spotify gets access to their messages. A failure to do that -- as appears to be the case here -- is yet another braindead move by Facebook.
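To illustrate what that kind of transparency could look like in practice, here is a minimal sketch of a consent step for a Messenger-style partner integration. Everything here -- the scope names, the PartnerIntegration structure, the functions -- is hypothetical and invented for the example; it is not Facebook's actual platform API:

# Hypothetical sketch: showing a user exactly which message permissions
# a partner app is requesting before anything is granted. Scope names
# and structures are invented for illustration only.
from dataclasses import dataclass, field

# Human-readable descriptions of what each (hypothetical) scope allows.
SCOPE_DESCRIPTIONS = {
    "messages.read": "Read your private messages",
    "messages.write": "Send messages on your behalf",
    "messages.delete": "Delete your messages",
    "threads.participants": "See everyone on your message threads",
}

@dataclass
class PartnerIntegration:
    partner_name: str
    requested_scopes: list = field(default_factory=list)

def consent_prompt(integration: PartnerIntegration) -> str:
    """Build the disclosure a user should see before granting access."""
    lines = [f"{integration.partner_name} is asking for permission to:"]
    for scope in integration.requested_scopes:
        lines.append(f"  - {SCOPE_DESCRIPTIONS.get(scope, scope)}")
    lines.append("Allow? [y/N]")
    return "\n".join(lines)

def grant_scopes(integration: PartnerIntegration, user_approved: bool) -> list:
    """Grant only what the user explicitly approved; nothing is implicit."""
    return list(integration.requested_scopes) if user_approved else []

if __name__ == "__main__":
    music_app = PartnerIntegration(
        partner_name="ExampleMusicApp",
        requested_scopes=["messages.read", "messages.write", "threads.participants"],
    )
    print(consent_prompt(music_app))
    print("Granted:", grant_scopes(music_app, user_approved=False))

The point of the sketch is simply that enumerating the requested powers up front is straightforward to build; what the Times describes reads like a policy failure, not a technical one.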
Over and over and over again we see this same pattern with Facebook. Even when there are totally reasonable and logical business and product decisions being made, the company's blatant unwillingness to be transparent about what it is doing, and who has access to what data, is what is so damning for the company. It is a total failure of the management team and until Facebook recognizes that fact, nothing will change.
And, of course, the most annoying part in all of this is that it will come back to bite the entire internet ecosystem. Facebook's continued inability to be open and transparent about its actions -- and give users a real choice -- is certainly going to lead to the kinds of hamfisted regulations from Congress that will block useful innovations from other companies that aren't so anti-user, but which will be swept up in whatever punishment Facebook is bringing to the entire internet.
Filed Under: data sharing, mark zuckerberg, people you may know, privacy, tracking
Companies: amazon, facebook, netflix, new york times, spotify
Reader Comments
You view total surveillance capitalism as a mere PR problem.
I think it's unacceptable and that the revolt point is closer than you believe. We The People don't have to allow Facebook to exist at all. (Which, incidentally, is what you clowns who believe that corporations have an absolute right to control "platforms" always overlook. It can all easily change overnight. Literally all we have to do is revise the FICTION of corporations.)
Re: You view total surveillance capitalism as a mere PR problem.
Corporate fictions aren't why this happened. A lack of understanding of the public is why this happened.
Re: Re: You view total surveillance capitalism as a mere PR prob
Standard fare ... or more like standard unfair.
You see, the public needs to be clairvoyant as this clearly could have been avoided by the public.
Re: Re: Re: Re: You view total surveillance capitalism as a mere
Makes sense.
Re: Re: You view total surveillance capitalism as a mere PR prob
"if they had been transparent, people could have made that choice, and the policies could have been crushed under the weight of protest or departure. Facebook may not have risen to prominence."
Exactly this.
Re: Re: Re: You view total surveillance capitalism as a mere PR
I don't have a Facebook presence but I developed a client for FB in 2013. To test it I had to create an account. I supplied my first name and zero other data. I even gave them a bogus birthday. During the course of testing I friended a member of my family so that I could send and receive messages and make sure the scopes I used provided the data I needed.
Shortly thereafter I started getting PYMK suggestions for people I actually knew who I had worked with years before, even before Facebook was available to the public and long before I created that test account. My relationships with those people ended before FB was available to the public. There is zero chance Facebook knew about them on their own, without outside data.
That invasiveness has prevented me from creating a real FB account as well as an account on any other social media. Apart from FB's inferred relationships with people I haven't spoken to in a very long time (and the NSA and other 3-letter fed orgs, of course), I have zero attributable footprint on the net. It's just way too damn creepy and there is apparently no way to opt out of it.
Re: Re: Re: You view total surveillance capitalism as a mere PR
As the company matured, I continued to see zero informative answers to the very fundamental questions I had raised at the beginning, other than some handwavy "acceptable use" promises that weren't technically capable of actually protecting anything.
Eventually they brought on people I knew to form a security team (when was that? 2013?) but they still hadn't directly answered any of the fundamental architecture questions surrounding data security: the new security team appeared to be in place to ensure that the door swung shut after the horse bolted.
So Facebook may not have been transparent, but they were transparently evasive from the beginning. I may grudgingly accept that from a politician, but I'm not going to accept it from any government employee or private company I do business with.
Thing is, if Facebook had been transparent from the start, it really might not have made much difference until the situation was abused. People would just excuse it with "sure, that's POSSIBLE, but they'd never actually do that" followed by "well, they actually did it, but it wasn't intentional, and they've promised never to do it again."
If people are getting something they consider to be of value, they're willing to give up quite a bit in order to retain that value.
This is why I don't like cloud infrastructure... the provider has to continually give you what you want in order for you to retain anything of value.
Fundamental Architecture Questions
This is the crux of *any* mass market platform on the internet. You get lots of data on lots of people, in a conveniently standardized form, which is a moral hazard, a temptation. Facebook is just the demon du jour; we could go as far as including Google, the US credit bureaus, the IRS, and even Techdirt, which I like -- remember when "blue" complained about someone not posting for a few years??
Such information will always be a temptation for abuse. The hazard is increased possibly to the point of certainty, when "You" are the product and not paying the bills directly.
Diversity and heterogeneity aren't really a solution. Nobody is going to have Mastodon data formats #1 through #N just to keep privacy; it is just too much work, and if the data is public, there will always be "bad actors" out there scraping it.
I'm not sure exactly how the credit bureaus do it, but I think there are some measures that can work to make the internet work a bit more like we might expect.
That expectation: The degree of customization is at the endpoint's discretion, and the more specific the customization, the more transparency required. If I say my name on Techdirt is Blue, then it is Blue, not the user from 29.112.258.4.
Some ideas:
*Generic browsing with poor fingerprinting properties (a rough sketch of this idea follows the list). No, server, you may *not* know what browser/OS/fonts/machine name/screen size/whatever I am using at the moment.
*Forgetting all details except final results, and sunsetting unused data.
*An attitude that says "Well, you want to target Techdirt/Facebook/Daily Stormer users with property X?" Fine, but we won't tell you who those users are, we will tell the users what property X was, and we'll make a report available on all such transactions.
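To make that first idea a bit more concrete, here is a rough sketch of why generic values matter: the fewer distinct attribute values a server can observe, the more users collapse into the same fingerprint bucket. The attribute names are illustrative only, not any real browser API:

# Rough sketch: browser attributes combine into a fingerprint, and
# reporting only generic values shrinks the space of distinguishable
# users. Attribute names are illustrative, not a real browser API.
import hashlib

def fingerprint(attributes: dict) -> str:
    """Hash the attribute set a server can observe into a short ID."""
    canonical = "|".join(f"{k}={attributes[k]}" for k in sorted(attributes))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

# A "talkative" browser exposes lots of distinguishing detail,
# so its fingerprint is close to unique per user.
detailed = {
    "user_agent": "ExampleBrowser/71.0 (X11; Linux x86_64)",
    "screen": "2560x1440",
    "fonts": "Arial,Comic Sans,DejaVu Serif",
    "timezone": "America/Denver",
}

# A "generic" browser reports the same bland values as everyone else,
# so millions of users share a single fingerprint bucket.
generic = {
    "user_agent": "ExampleBrowser",
    "screen": "unknown",
    "fonts": "default",
    "timezone": "UTC",
}

print("detailed:", fingerprint(detailed))
print("generic: ", fingerprint(generic))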
Re: You view total surveillance capitalism as a mere PR problem.
And destroy any business larger than a partnership, all churches, oh, and Congress itself. Corporations exist to allow property in particular to be owned by an organization, rather than by the people running it.
Re: You view Trolls as a moderation problem
Are you specifically promoting armed revolt today? And claiming once again that corporations shouldn't exist because you don't like them?
This is unacceptable.
Read, write, and delete privileges given to anyone with enough money. This is something that should really be divulged in huge font where it will not be missed; otherwise it is fraudulent, and possibly criminally so.
The second sentence directly contradicts the first. If they're doing things that they know will upset users, then it would be an issue if the users knew about it, which is why Facebook chose to be sneaky and evasive instead.
The real problem is the blatant privacy violations they engaged in. Compared to that, trying to hide it is a much smaller offense IMO.
I think what you actually meant to say is "yet another example of willfully malicious behavior by a company that was built from the ground up on exactly that sort of behavior".
Re:
I think what you actually meant to say is "yet another example of willfully malicious behavior by a company that was built from the ground up on exactly that sort of behavior".
I think it would be both a mistake and a recipe for very bad outcomes to believe this. The company is most certainly not willfully malicious. Most people working there really do believe that they're making good choices for their users and building better services that are useful to them. Where they fall down (repeatedly) is in deciding for those users what will be "best," and part of that is driven, stupidly, by a focus on "growth" over "value."
A smarter company recognizes that value over time leads to growth. Growth over value does the opposite. Over time. It's all about time horizons.
Re: Re: Re:
I don't know that I'd say "willfully malicious", but I would say amoral, and probably myopic. "Move fast and break things" may not be the official motto anymore, but it's nonetheless a pretty good summary of the company's ethos.
What's the old saying? Better to beg forgiveness than ask permission?
Re: Re: Re: Re:
Right, malice is the wrong way to think about it. You are the chum to the Facebook shark.
Re: Re:
Big Al Capone was not a criminal but engaged in the entertainment business.
Facebook has from day one violated the 4th and 5th Amendments and is just as much a criminal organization as Big Al was for violating the 18th Amendment.
Re: Re: Re: Re:
The right of the people to be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures, shall not be violated, and no Warrants shall issue, but upon probable cause, supported by Oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.
https://en.wikipedia.org/wiki/Fourth_Amendment_to_the_United_States_Constitution
There is nothing that I can see in that wording that limits the application of the 4th amendment to government.
Re: Re: Re: Re: Re:
Now there's a novel legal theory! And if exceptio probat regulam in casibus non exceptis, then it would seem Facebook can issue arrest warrants if it has probable cause. Interesting...
Re: Re: Re: Re: Re:
the first ten amendments, known collectively as the Bill of Rights, offer specific protections of individual liberty and justice and place restrictions on the powers of government
Where does that mention companies and corporations and their powers? Those are things that the government can regulate by means of passing laws, so long as the government does not exceed the bounds set by the Constitution.
Re: Re: Re: Re: Re: Re:
https://yourstory.com/2014/08/bengal-famine-genocide/
Re: Re: Re: Re: Re:
Ah, you're doing that thing where you read the literal text of the Constitution, remove it from the context of (1) the circumstances under which it was written and (2) the past 200 years of case law interpreting it, and act like you're making an argument that is very clever.
In the words of David St. Hubbins, "There's such a fine line between stupid and clever."
Re: Re:
Yes, it is.
"Most people working there really do believe that they're making good choices for their users and building better services that are useful too them."
Most people working at Facebook have no clue what the executive level is doing.
It's comparable to people working AT&T customer service and having to listen to complaints of customers as their bills go up again since AT&T knows competition is non-existent.
It seems the NYT has confidential information knowing certain "customers" have access to data many other businesses do not, indicating a certain price point is met to obtain this information (or an agreed relationship).
"You miss the point, that if they had been transparent, people could have made that choice, and the policies could have been crushed under the weight of protest or departure. Facebook may not have risen to prominence."
He didn't miss the point. You did. The majority of Facebook users couldn't care less about the information shared. Many claim "I ain't got nothing to hide" as a justification to get that latest user-specific feed article into their eyeballs.
Those few who understand what's at stake are trying to raise the alarm, but it's going to be unheard over the laughter at imbeciles mocking another meme or tweet as their privacy is eroded.
In engineering, there's a natural fact even Einstein could not argue: it's impossible to fix stupid.
As much as I hate to say it, users are more responsible for the issues of Facebook than the company itself.
These people allow phones to track their every movement, bring microphones into their homes (and actually set them up for use), and don't even bother using a VPN to protect their online history.
Facebook is 100% malicious because it's taking advantage of this willfully.
Do not ever defend this company again.
Facebook is too big to jail
So what? Facebook is too big to jail. It won't even be prosecuted.
18 USC 2702 may say…
But the code there is just saying that. It doesn't really mean it. That provision is inoperative when it comes to Facebook. The code is a dead letter.
It will not be applied.
Just watch. The code only applies to the less wealthy.
The only thing more predictable than Mark Zuckerberg getting caught violating people's trust once again is seeing Mike Masnick trying to defend him against justifiably angry Techdirt commenters.
Your boilerplate warnings about how "hamfisted regulations" will block "useful innovations" ring hollow. It's very similar to the rhetoric that pro-ISP industry groups use to defend money-grubbing ISPs and Ajit Pai's dismantling of pro-consumer regulations. Facebook and other social media companies like it have shown time and again that they don't care about their users. They need to be brought to heel with regulatory power.
Re:
As a result, it will take a sustained effort by thinkers and policymakers to develop the conceptual basis needed for scalable, efficient regulation, not a "one and done" grandstanding bill hacked together from lobbyist input, emotional impulses, and misunderstandings. Thankfully, this ball is already rolling, starting with Balkin's work on information fiduciaries, but more refinement will be needed to develop it into something Congress can use, and pushing for Congress to "do something!" in the meantime simply leads to clumsy, buggy lawmaking, rife with unintended consequences.
A thought that occurs to me: is this sort of the problem they had with Cambridge?
They had no real idea how much access they got out of the partnership, because FB only tells them about what they want to know. Why build controlled, defined channels when you can just give them the firehose of data and tell them how to grab the info they want? No one would look at the rest of the stream to see what else they had access to; they're the good guys, they're giving us data, we give them data, and everyone is happy!