Both Things Are True: Press Freakouts Over Facebook's Practices Have Been Misleading & Facebook Has A Privacy Problem
from the facebook-derangement-syndrome dept
And so we're back with Facebook Derangement Syndrome. As we've noted a few times in the past, many of the freakouts about Facebook's privacy practices involve completely misunderstanding or exaggerating what Facebook actually did -- presenting things not just in the worst possible light, but in an actively misleading way. This is especially true in the context of privacy, where many people seem to interpret Facebook's good decision not to lock down YOUR OWN access to your own data as a bad thing, and then pressure the company to lock up that access, limiting what you can do with your own data.
Of course, there is some inherent conflict between open systems and privacy. Indeed, going back eleven years, we had a post highlighting the potential privacy conflicts of Facebook's "open social graph." And, of course, at the time, Facebook was celebrated for being so open, for not locking up everyone's data but instead enabling it to be used more widely in other systems.
And that brings us to this week's big NY Times story on Facebook. As we already discussed, what it really highlighted is what a terrible job Facebook does at being open and transparent about how it uses data. But we were also left with some questions about the claims in the NYT report, especially the claim that other companies had access to users' messages.
As more people have looked at it, it increasingly appears that the NY Times reporting on this was really, really bad and contributed to the hysteria rather than improving understanding. The companies that had access to Facebook messages got that access through software integrations that let you use Facebook Messenger directly from their apps -- in the same way that if you want to use Facebook Messenger on your mobile phone, you have to give that phone access to your messages so that... you can use FB Messenger.
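To make that mechanism concrete, here is a minimal sketch of the kind of OAuth-style permission grant these integrations generally relied on: the partner app sends the user to Facebook to explicitly approve a message-access permission, and only then receives a token it can use on that user's behalf. The endpoint, scope name, and example IDs below are illustrative assumptions, not a description of Facebook's actual current API.

```python
# Illustrative sketch (NOT Facebook's actual current API): a partner app
# asks the user to grant it a message-access permission via a standard
# OAuth authorization-code flow. The scope name "read_mailbox" and the
# example IDs/URLs are assumptions for illustration only.
from urllib.parse import urlencode

AUTH_ENDPOINT = "https://www.facebook.com/dialog/oauth"  # illustrative

def build_authorization_url(app_id: str, redirect_uri: str) -> str:
    """Build the URL the partner app sends the user to, where the user
    explicitly approves (or refuses) giving the app access to their messages."""
    params = {
        "client_id": app_id,            # the partner app's ID
        "redirect_uri": redirect_uri,   # where the user is sent back after approving
        "response_type": "code",        # standard OAuth authorization-code flow
        "scope": "read_mailbox",        # hypothetical message-access permission
    }
    return f"{AUTH_ENDPOINT}?{urlencode(params)}"

if __name__ == "__main__":
    # The user clicks through this URL, approves the permission, and the app
    # then exchanges the returned code for a token scoped to that permission.
    print(build_authorization_url("PARTNER_APP_ID", "https://partner.example.com/fb/callback"))
```

The point of the sketch is that, under this model, message access is something the user grants to a specific app for a specific purpose -- which is exactly the kind of integration people once asked for.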
As Mathew Ingram notes in an article about this, early on many people rightfully celebrated Facebook's open approach, which meant not locking down data but purposefully exposing it to make the rest of the internet more useful. It was the kind of openness and open integration most people used to celebrate: the opposite of building a locked-box silo of your data.
Will Oremus, over at Slate, further notes that the integrations Facebook is now being slammed for in the Times were ones that people were happy about in the past, though perhaps naively.
The companies’ Facebook integrations simply allowed existing customers to log into their Facebook accounts from within the streaming app and use its messaging features without having to navigate to Facebook proper. It’s the sort of arrangement that looks foolhardy or even sinister today but that many internet users took for granted at the time.
I know that because I was one of them. I thought nothing of using Facebook to log into Spotify, because I naïvely trusted Facebook to guard my data, probably more so than I trusted Spotify. I even tested for a while a Mozilla Firefox feature that brought a Facebook feed directly into your browser, as a sidebar, so that you could see what your friends were up to even when you were on other websites. It eventually dawned on me that this was imprudent, and certainly there were some activists at the time who were sounding alarms, but it was hardly a scandal.
None of that is to say that Facebook doesn't have serious problems. As I wrote when the NY Times piece first came out, the company seems to trip over itself to be sneaky and combative in explaining all of this, and it has always done a terrible job of transparently explaining how the data is and can be used.
But we should be focusing on the real issues regarding our privacy online, rather than cooking up bogus issues to argue about. When we focus on the wrong things, inevitably, whatever "solution" is proposed will make things much, much worse.
And, again, there are real issues here. Facebook letting Amazon look at who you know to determine whether or not reviews are allowed... that's a problem. No one was told about that. And that wasn't just about creating integrations to help users do something. That was a questionable sharing of information with a corporate partner, without user permission.
So, we should be able to admit that Facebook has a real privacy problem (and perhaps an even bigger transparency/honesty problem), without immediately jumping on every conspiracy theory about Facebook, many of which are not actually accurate.
Filed Under: data, integrations, journalism, openness, privacy, reporting, silos
Companies: facebook