from the it's-a-mess dept
In my last post, I described why it was wrong to focus on claims of Facebook "selling" your data as the "problem" revealed over the weekend concerning Cambridge Analytica and the data it had on 50 million Facebook users. As we described in detail in that post, that's not the problem at all. Instead, much of the problem has to do with Facebook's utter failure to be transparent in a way that matters -- specifically, in a way that lets its users actually understand what's happening (or what may happen) to their data. Facebook would likely respond that it has tried to make that information clear (or, alternatively, that it can't force users to understand what they won't take the time to understand). But I don't think that's a good answer. As we've learned, there's a lot more at stake here than I think even Facebook recognized, and providing real transparency (rather than the superficial kind) is what's necessary.
But that's not what most people are suggesting. For example, a bunch of people are calling for "Know Your Customer" regulations similar to those found in the financial sector. Others seem to be blindly demanding "oversight" without being able to clearly articulate what that even means. And some are bizarrely advocating "nationalizing Facebook," which would literally mean handing billions in taxpayer dollars to Mark Zuckerberg. But these "solutions" won't solve the actual issues. The article about KYC rules, for example, includes the following:
“They should know who’s paying them,” said Vasant Dhar, a professor of information systems at New York University, “because the consequences are very serious.” In December, Dhar wrote an op-ed calling for social media regulation — specifically, something similar to the “know your customer” laws that apply to banks. “The US government and our regulators need to understand how digital platforms can be weaponized and misused against its citizens, and equally importantly, against democracy itself,” he wrote at the time.
Antonio García-Martinez, Facebook’s first targeted ads manager, thinks so too. “For certain classes of advertising, like politics, a random schmo with a credit card shouldn’t just be able to randomly run ads over the entire Facebook system,” he told me.
Except... that has literally nothing to do with what the Cambridge Analytica controversy is about. And, anyway, as we've discussed before, the Russians bent over backwards to pretend to be Americans when buying ads, so it's not as if KYC rules would have helped on the advertising side. The whole Cambridge Analytica episode may have involved some ads (and lots of other stuff), but Facebook knew who "the customer" was in that instance. And it knew that an "academic" was slurping up data for "academic research." Knowing your customer wouldn't have made the slightest difference here.
Even Tim Berners-Lee, who recently stirred the pot by suggesting regulations for social media, doesn't seem to offer anything concrete that would have mattered here.
What’s more, the fact that power is concentrated among so few companies has made it possible to weaponise the web at scale. In recent years, we’ve seen conspiracy theories trend on social media platforms, fake Twitter and Facebook accounts stoke social tensions, external actors interfere in elections, and criminals steal troves of personal data.
We’ve looked to the platforms themselves for answers. Companies are aware of the problems and are making efforts to fix them — with each change they make affecting millions of people. The responsibility — and sometimes burden — of making these decisions falls on companies that have been built to maximise profit more than to maximise social good. A legal or regulatory framework that accounts for social objectives may help ease those tensions.
I don't think Tim is wrong per se in arguing that there are issues with how much power is concentrated among a small group of large companies -- but I'm not sure that "a legal or regulatory framework" actually fixes any of that. Indeed, it seems highly likely to do the reverse.
As Ben Thompson notes in his own post about this mess, most of the regulatory suggestions being proffered would lock in Facebook as an entrenched incumbent. That's because they would (a) create barriers that Facebook can deal with but startups cannot, and (b) focus on "cementing" Facebook's model (with safeguards) rather than letting the next wave of creative destruction displace it.
It seems far more likely that Facebook will be directly regulated than Google; arguably this is already the case in Europe with the GDPR. What is worth noting, though, is that regulations like the GDPR entrench incumbents: protecting users from Facebook will, in all likelihood, lock in Facebook’s competitive position.
This episode is a perfect example: an unintended casualty of this weekend’s firestorm is the idea of data portability: I have argued that social networks like Facebook should make it trivial to export your network; it seems far more likely that most social networks will respond to this Cambridge Analytica scandal by locking down data even further. That may be good for privacy, but it’s not so good for competition. Everything is a trade-off.
Note that last bit? A good way to take away Facebook's dominance is to enable others to compete in the space. The best way to do that? Make it easy for people to switch from Facebook to upstart competitors. The best way to do that? Make it easier for Facebook users to export their data... and use it on another service. But as soon as you do that, you're right back in the risky zone. Why is Facebook in so much hot water right now? Because it made it too easy to export user data to third-party platforms! And any GDPR-type solution is just going to lock down that data, rather than letting it help seed competition.
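Just to make concrete what "portability" means here: at its simplest, it's taking an export from one service and reshaping it into something a competitor can import. Here's a minimal sketch in Python -- every file name and field name below is invented for illustration, since real export formats vary from service to service:

import json

# A toy converter: read a (hypothetical) social-network export file and
# emit a neutral, portable contact list a competing service could import.
# All field names here are invented for this illustration.
def convert_export(export_path, output_path):
    with open(export_path) as f:
        export = json.load(f)

    portable = [
        {"name": friend["name"], "connected_since": friend["added_on"]}
        for friend in export.get("friends", [])
    ]

    with open(output_path, "w") as f:
        json.dump({"contacts": portable}, f, indent=2)

convert_export("facebook_export.json", "portable_contacts.json")

The technical side of portability really is that mundane; the hard part is whether the platform makes the export available at all, and whether third parties are allowed to build on it.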
Cory Doctorow, over at EFF, has what I think is the most reasonable idea of all: enable third parties to build tools that help Facebook's (and every other platform's!) users better manage and understand their privacy settings and what's being done with their data. That's an actual solution to the problem we laid out in the previous post: Facebook's failed transparency. Doctorow compares the situation to ad blockers: ads became too intrusive, and users were able to turn to third-party tools to stop the bad stuff. We should be able to do something similar with privacy and data controls. But, thanks to some pretty dumb laws and court rulings (including a key one that Facebook itself caused), that's really not possible:
This week, we made you a tutorial explaining the torturous process by which you can change your Facebook preferences to keep the company’s “partners” from seeing all your friends’ data. But what many folks would really like to do is give you a tool that does it for you: go through the tedious work of figuring out Facebook’s inscrutable privacy dashboard, and roll that expertise up in a self-executing recipe — a piece of computer code that autopiloted your browser to login to Facebook on your behalf and ticked all the right boxes for you, with no need for you to do the fiddly work.
But they can’t. Not without risking serious legal consequences, at least. A series of court decisions — often stemming from the online gaming world, sometimes about Facebook itself — has made fielding code that fights for the user into a legal risk that all too few programmers are willing to take.
That's a serious problem. Programmers can swiftly make tools that allow us to express our moral preferences, allowing us to push back against bad behavior long before any government official can be convinced to take an interest — and if your government never takes an interest, or if you are worried about the government's use of technology to interfere in your life, you can still push back, with the right code.
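To see just how low the technical barrier is for the kind of "self-executing recipe" Doctorow describes, here's a minimal sketch using the off-the-shelf Selenium browser-automation library. The URL and the checkbox selector are hypothetical placeholders (Facebook's actual dashboard markup isn't documented here), so treat this as an illustration of the technique, not a working tool:

from selenium import webdriver
from selenium.webdriver.common.by import By

# Hypothetical settings page; the real URL and markup will differ.
SETTINGS_URL = "https://www.facebook.com/settings?tab=applications"

driver = webdriver.Firefox()
driver.get(SETTINGS_URL)  # assumes the user is already logged in

# Walk every data-sharing checkbox on the page (the selector is a
# placeholder for this sketch) and untick any that are enabled.
for box in driver.find_elements(
        By.CSS_SELECTOR, "input[type='checkbox'][name^='share_']"):
    if box.is_selected():
        box.click()

driver.quit()

That's the entire "recipe": a dozen lines of commodity automation. The barrier Doctorow is describing is legal, not technical -- which is exactly why the court rulings he mentions matter so much.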
So if we really, truly want to deal with the problem, then we need to push for more control by end users. Let users control and export their data, and let people build tools that help them do so -- tools that also make transparent what others are doing with that data.
If someone comes up with a "regulatory regime" that does that, it would be fantastic. But so far, nearly every suggestion I've seen has gone in the other direction. They would force Facebook to "lock down" its data even more, making it harder for users to extract it, or for third parties to provide users the tools they need to control their own data. They would put in place useless but onerous Know Your Customer rules that Facebook can throw money at, but that every smaller platform will find incredibly costly.
I'm not optimistic about how all of this works out. Even if you absolutely hate Facebook and think the company is evil, doesn't care one whit about your privacy, and is run by the most evil person on the planet, you should be especially worried about the regulatory suggestions that are coming. They're not going to help. They're going to entrench Facebook and lock down your data.
Filed Under: data, elections, know your customer, openness, platforms, regulations, social media, transparency
Companies: cambridge analytica, facebook