Mark Zuckerberg Finally Speaks About Cambridge Analytica; It Won't Be Enough
from the more-to-be-done dept
It took way too long, but Mark Zuckerberg finally responded to the Cambridge Analytica mess on Wednesday afternoon. As we've discussed in a series of posts all week, this is a complex issue, one where a lot of people are getting the details wrong -- and thus where most of the proposed responses are likely to make the problem worse.
To be clear, Mark's statement on the issue is not bad. It's obviously been workshopped through a zillion high-priced PR people, and it avoids all the usual "I'm sorry if we upset you..." kind of tropes. Instead, it's direct, it takes responsibility, it admits error, it does very little to try to "justify" what happened, and it lists out concrete steps the company is taking in response to the mess.
We have a responsibility to protect your data, and if we can't then we don't deserve to serve you. I've been working to understand exactly what happened and how to make sure this doesn't happen again. The good news is that the most important actions to prevent this from happening again today we have already taken years ago. But we also made mistakes, there's more to do, and we need to step up and do it.
It runs through the timeline, and appears to get it right based on everything we've seen (so no funny business with the dates). And, importantly, Zuckerberg notes that even if it was Cambridge Analytica that broke Facebook's terms of service on the API, that's not the larger issue -- the loss of trust in the platform is the issue.
This was a breach of trust between Kogan, Cambridge Analytica and Facebook. But it was also a breach of trust between Facebook and the people who share their data with us and expect us to protect it. We need to fix that.
The proactive steps Facebook is taking are all reasonable as well: investigating the apps that had broad data access before the old API was shut down, to see who else sucked up what data; further restricting access to data; and finally giving more transparency and control to end users:
First, we will investigate all apps that had access to large amounts of information before we changed our platform to dramatically reduce data access in 2014, and we will conduct a full audit of any app with suspicious activity. We will ban any developer from our platform that does not agree to a thorough audit. And if we find developers that misused personally identifiable information, we will ban them and tell everyone affected by those apps. That includes people whose data Kogan misused here as well.
Second, we will restrict developers' data access even further to prevent other kinds of abuse. For example, we will remove developers' access to your data if you haven't used their app in 3 months. We will reduce the data you give an app when you sign in -- to only your name, profile photo, and email address. We'll require developers to not only get approval but also sign a contract in order to ask anyone for access to their posts or other private data. And we'll have more changes to share in the next few days.
Third, we want to make sure you understand which apps you've allowed to access your data. In the next month, we will show everyone a tool at the top of your News Feed with the apps you've used and an easy way to revoke those apps' permissions to your data. We already have a tool to do this in your privacy settings, and now we will put this tool at the top of your News Feed to make sure everyone sees it.
That's mostly good, though as I explained earlier, I do have some concerns about how the second issue -- locking down the data -- might also limit the ability of end users to export their data to other services.
Also, this does not tackle the better overall solution we mentioned yesterday, originally pitched by Cory Doctorow: opening up the platform not to third-party apps that suck up data, but to third-party apps that help users protect and control their own data. That part is missing, and it's a big part.
If you already hated Zuckerberg and Facebook, this response isn't going to be enough for you (no response would be, short of shutting the whole thing down, as ridiculous as that idea is). If you already trusted him, then you'll probably find this to be okay. But a lot of people are going to fall in the middle, and what Facebook actually does in the next few months is going to be watched closely -- and will be important. Unless and until the company also allows more end-user control of privacy, including by third-party apps, it feels like this will fall short.
And, of course, it seems highly unlikely that these moves will satisfy the dozens of regulators around the world seeking their pound of flesh, nor the folks who are already filing lawsuits over this. Facebook has a lot of fixing to do. And Zuckerberg's statement is better than a bad statement, but that's probably not good enough.
Meanwhile, as soon as this response was posted, Zuckerberg went on a grand press tour, hitting (at least): CNN, Wired, the NY Times and Recode. It's entirely possible he did more interviews too, but that's enough for now.
There are some interesting tidbits in the various interviews, but most of what I said above stands. It's not going to be enough. And I'm not sure people will be happy with the results. In all of the interviews he does this sort of weird "Aw, shucks, I guess what people really want is to have us lock down their data, rather than being open" thing that is bothersome. Here's the Wired version:
I do think early on on the platform we had this very idealistic vision around how data portability would allow all these different new experiences, and I think the feedback that we’ve gotten from our community and from the world is that privacy and having the data locked down is more important to people than maybe making it easier to bring more data and have different kinds of experiences.
But, of course, as we pointed out yesterday (and above), all this really does is lock in Facebook, and make it that much harder for individuals to really control their own data. It also limits the ability of upstarts and competitors to challenge Facebook. In other words, the more Facebook locks down its data, the more Facebook locks itself in as the incumbent. Are we really sure that's a good idea? Indeed, when Wired pushes him on this, he basically shrugs and says "Well, the people have spoken, and they want us to control everything."
I think the feedback that we’ve gotten from people—not only in this episode but for years—is that people value having less access to their data above having the ability to more easily bring social experiences with their friends’ data to other places. And I don’t know, I mean, part of that might be philosophical, it may just be in practice what developers are able to build over the platform, and the practical value exchange, that’s certainly been a big one. And I agree. I think at the heart of a lot of these issues we face are tradeoffs between real values that people care about.
In the Recode interview, he repeats some of these lines, and even suggests (incorrectly) that there's a trade-off between data portability and privacy:
“I was maybe too idealistic on the side of data portability, that it would create more good experiences — and it created some — but I think what the clear feedback from our community was that people value privacy a lot more.”
But... that's only true in a world where Facebook controls everything. If Facebook actually gave users more control and transparency, then each user could decide how her data is shared, and you could have both portability and privacy.
One other interesting point he raises in that interview: we should not be letting Mark Zuckerberg make all the decisions about what is and what is not okay:
“What I would really like to do is find a way to get our policies set in a way that reflects the values of the community, so I am not the one making those decisions,” Zuckerberg said. “I feel fundamentally uncomfortable sitting here in California in an office making content policy decisions for people around the world.”
“[The] thing is like, ‘Where’s the line on hate speech?’ I mean, who chose me to be the person that did that?,” Zuckerberg said. “I guess I have to, because of [where we are] now, but I’d rather not.”
But, again, that's a choice Facebook is making in becoming the data silo. If end users had more control, and more tools with which to exercise it, then it would no longer be Mark's choice. Open up the platform not for "apps" that suck up users' data, but for tools that allow users to control their own data, and you get a very different result. But that's not where we're moving. At all.
So... so far this is moving in exactly the direction I feared when I wrote about this yesterday. "Solving" the problem this way isn't going to solve it for real -- it's just going to end up giving Facebook greater power over our data. That's an unfortunate result.
Oh. And for all of the apologizing and regrets Zuckerberg offers in these interviews, he has yet to explain why, if this is such a big deal, Facebook threatened to sue one of the journalists who broke this story. That seems like an important point for him to weigh in on.
Filed Under: damage control, data portability, dominance, mark zuckerberg, privacy, silos
Companies: cambridge analytica, facebook
Reader Comments
"It's not going to be enough."
They cannot get on the wrong side of people: too many people are far too complacent and stupid for the rest to be of significance. They might worry about getting on the wrong side of governments that could block their access to the complacent and stupid people, even if for stupid reasons.
"We have a responsibility to protect your data, and if we can't then we don't deserve to serve you."
So, he is finally admitting to the world that no one should use Facebook. This is an almost inevitable endpoint of the ideology that users of Facebook, Gmail, etc are not the customers. They are the product.
Cambridge Analytica would be unable to do business with Facebook, because they would no longer be able to find a company to underwrite their bond.
"Facebook threatened to sue one of the journalists...."
So who is in charge again? The citizens in all of these "faux" democracies or the regulators?
"seeking their pound of flesh"
Indeed... indeed!
Re: "It's not going to be enough."
No, they might worry about governments blocking their access to money, but what government would actually block Facebook?
Zuckerberg is lying, of course
Re: Re: "It's not going to be enough."
Blocking Facebook means blocking a potential (or actual) revenue stream. That's not a typical activity of most governments. They're more interested in fines that generate short-term revenue and deals that generate longer-term revenue.
As always, it's all about the money.
Re:
Cambridge Analytica should never have had access to the data in the first place. There would be nothing to bond if privacy was respected.
Re: Zuckerberg is lying, of course
How has Facebook partnered with political campaigns historically?
What access was granted to Obama campaigns? How did the campaigns use their access to that data?
Are other tech companies sharing data sets with political campaigns? How are those data sets being used?
Eric Schmidt drafted a campaign strategy email for 2016, including a description of voter scores. The voter scores were to be calculated from data sets held by Google, Comcast, and other companies.
Were those voter scores ever calculated? How were they used to drive campaign decisions? Are those voter scores being used for any other purpose?
What about Target getting hacked? What about Transamerica getting hacked? What about our government getting hacked? All of these gave up a hell of a lot more personal information than Facebook even has: SSNs, credit information, buying information.
If it didn't involve Trump, no one would care, but it did involve Trump, so now everyone is up in arms?
Re:
Give me a break. This has nothing to do with privacy and everything to do with Trump. Privacy? So Facebook sells information based on what people put on their profiles, which they know becomes public information?
I take it you've read literally none of the other articles I wrote about this, in which I addressed all those points?
Re: Re: Zuckerberg is lying, of course
How do you figure? Sociopathy is as simple an explanation as malice, and it's known to be common among CEOs.
Re: Re:
Typical of today, where a few sexist comments are equal to actual rape.
Whoops.
Fixed that for them
Re: Fixed that for them
They have no interest in protecting it, only selling it.
I really don't understand why people keep saying this. FACEBOOK DID NOT AND DOES NOT SELL YOUR DATA. The more you say this false thing, the more you actually help Facebook, because all it does is encourage the kind of nonsense in the post here, where Zuckerberg gets to pretend he's "forced" to lock down YOUR data.
There is no selling of data here.
Re: Re: Re: Re: Re:
By contrast, Cambridge Analytica et al. apparently told people they were engaged in a personality survey, and when people opted in to that survey, pulled data unrelated to the survey without telling the subjects about it.
The difference is transparency, and for all the quite legitimate and deserved bashing the Obama administration gets over "most transparent administration in history", that does IMO make all the difference.
Re: Bond?
Anyone who has a problem with CA's actions but not with Zuckerberg's own is either hopelessly naive or a hypocrite.