from the it's-not-evil,-it's-not-incompetent dept
Over the last few weeks, the WSJ has run a series of articles collectively called "The Facebook Files," which have exposed a variety of internal documents from Facebook that are somewhat embarrassing. I do think some of the reporting is overblown -- and, in rather typical fashion for big news publications reporting on Facebook, presents everything in the worst possible light. For example, the report on how internal research showed that Instagram made teen girls feel bad about themselves downplays that the same data shows a significantly higher percentage of teens indicating that Instagram made them feel better.
But, of course, the WSJ's headline presents it very differently.
None of this is to say that this is okay, or that Facebook shouldn't be trying to figure out ways to minimize how much its sites make people feel worse about themselves. But the reporting decisions here do raise some questions.
Another one of the articles highlights how Facebook has different rules for different users when it comes to content moderation. And, again, at first pass this sounds really damning:
The program, known as “cross check” or “XCheck,” was initially intended as a quality-control measure for actions taken against high-profile accounts, including celebrities, politicians and journalists. Today, it shields millions of VIP users from the company’s normal enforcement process, the documents show. Some users are “whitelisted”—rendered immune from enforcement actions—while others are allowed to post rule-violating material pending Facebook employee reviews that often never come.
The report notes that nearly 6 million people are on that list (including, somewhat ironically, some high-profile conservatives who have whined about how the site's policies are biased against them). There's simply no way this looks good -- as insiders at Facebook readily admitted in internal documents. But, rather than being particularly nefarious, there are some pretty understandable reasons why such a list might have come about. As Dave Willner, who worked on content policy at Facebook in its early days, explained on Twitter, such systems are pretty common and "they get proposed spontaneously as a way of controlling the chaos that occurs when moderation affects high-profile or influential accounts."
Another way to think about this is that -- as I've said often -- one of the trickiest parts of content moderation is understanding the wider context of any individual piece of content. This is way more difficult than most people realize. But context matters, and looking at an individual piece of content outside of the context in which it was created gives you a misleading picture of that content's intent and impact. A sort of "be more careful with these accounts" list is very inexact and very messy, but it is also a very quick and easy way of adding a contextual layer. It basically starts as a tool to say "hey, be careful with these accounts, because making a mistake with them has extra consequences." But, over time, because of human nature, it evolves into "these accounts are protected."
This isn't a defense of the list. As Robyn Caplan and Tarleton Gillespie rightly note in the Washington Post, whatever the reasons for the list, any setup that is (1) not transparent and (2) seen to lead to unequal treatment naturally breeds suspicion and distrust.
In the days since the Facebook Files came out, I've been having a few conversations with people about the write-ups and what it all means. There is general agreement that none of this makes Facebook look good. And it shouldn't. There's really nothing in all of this that's good for Facebook. A few of the discussions, though, jumped to the argument that Facebook's executive team is "evil." A lesser version of this is that they're totally incompetent. I don't think either is quite true. Thinking it through, my sense is that Facebook's executive team (1) is in deeper than it realizes, and (2) falsely believes it has a better handle on things than it really does.
This is an issue that is all too common, especially in the internet world, where there's a kind of myth around "visionary" founders -- one that's been applied to pretty much every successful internet founder. Indeed, recent research throws some cold water on the idea that brilliant founders lead to big breakthroughs, suggesting instead that successful companies are more about being in the right place at the right time, with minimally competent leaders who keep everything from going off the rails.
This is (quite obviously) from the outside looking in, but my impression of much of Facebook's leadership is that they've bought a bit too much into the myth of their own brilliance, and their own ability to work their way through challenges -- at the size and scale of an operation that isn't just providing a service to roughly a third of the globe, but is also seeking to get those people to interact with one another. Nearly all of human history is about our general failure to get along with one another, and many of the problems facing Facebook are of that very nature.
There is something of an open question as to which of these problems are merely revealed or exposed by Facebook, which are exacerbated by Facebook, and which are actually diminished by Facebook. It seems likely that all three of these forces are at play, and no one has a full grasp on how to deal with problems that are a part of human nature -- or how to minimize humanity's worst impulses across the globe. I don't think Facebook has the answers, but sometimes I fear that some of the company's leaders think they do -- or that they can outthink the world in how they approach these problems.
The issues that become clear in all of this reporting are not those of a company nefariously run by evil geniuses toying with people's minds (as some would have you believe). Nor are they those of incompetent buffoons careening human society into a ditch under a billboard screaming "MOAR ENGAGEMENT! MOAR CLICKS!" It seems pretty clear that these are decently smart, decently competent people... who have ended up in an impossible situation and don't recognize that they can't solve it all alone.
Over and over again, this framing seems to explain actions that might otherwise be incomprehensible. So many of the damning reports from the Facebook Files could be evidence of evil or incompetence -- or they could be evidence of a group of executives who are in way too deep, but believe they have a handle on things that they not only don't have, but simply can't have, due to the nature of humanity itself.
Facebook and Instagram were never going to cure depression, or keep teen girls from feeling bad about themselves. And, hell, to give the company a tiny bit of credit (don't worry, I'll take it away in a moment), the very fact that it did this research in the first place, and recognized how shitty some people feel on the site, is a step that many, many companies never take. On the flip side, how the company actually handled the research is part of the problem. As Will Oremus points out at the Washington Post, the real issue here is the burying of the findings, not necessarily the findings themselves.
And, again, viewed through the prism of a hubristic "we can fix this, because we're brilliant!" mentality, you can see how that happens. There's a correct realization that the report will look bad -- so it can't be talked about publicly, because Facebook seems to have a deep insecurity about ever looking bad. But the general belief seems to be that if they just keep working on it -- just make the latest tweak to the UI or the algorithm -- then maybe, just maybe, they can "show" that they've somehow improved humanity. It's a belief in themselves and their abilities that simply isn't realistic. But it does explain how the company handles nearly all of these scandals.
One thing I've grown to appreciate as I've gotten older is how much more complex the world is than it sometimes seems. More and more often, I've realized how the complex interplay of different variables means that nobody understands anything perfectly. There's always another variable (or several) missing. I increasingly value input from a larger and more diverse set of sounding boards who can point out the giant thing I'm missing when trying to understand complex topics. But all this also makes me that much more skeptical of those who act as though they have it all figured out.
I don't think the Facebook Files show a company that is evil or incompetent. They show a company that is in way too deep, but still thinks it can get everything back under control.
Filed Under: depression, evil, humanity, incompetence, mark zuckerberg, scandals, social media, social problems
Companies: facebook