from the wait,-whose-fault-is-it-now? dept
I'm going to try, once again, to do that stupid thing where I try to express a nuanced point on the internet, where there's a high likelihood of it being misunderstood. So, consider this opening a warning that you should read this entire article to try to get at the larger point.
And, along those lines, there are two parts to this story, and while much of it is going to point some fingers at the NY Times and Washington Post in how they presented a story that suggested blaming Facebook for something that isn't actually a Facebook issue, that shouldn't be seen as letting Facebook off the hook, because it doesn't come out of this story looking very good either. Basically, this is a story that shows how much more complex and complicated our information ecosystem is when it comes to misinformation, and simple blame games aren't necessarily that effective.
But, first, some background: for a long time, NY Times reporter Kevin Roose has used Facebook's own CrowdTangle tool to highlight what content on Facebook was getting the most engagement. It is a consistently useful tool in showing how claims that Facebook has an "anti-conservative bias" are bullshit. It constantly shows top "conservative" personalities like Ben Shapiro, Dan Bongino, and others as having the most engagement on the site.
For reasons I don't fully understand, Facebook has always hated this, and has wasted an enormous amount of effort repeatedly insisting that Roose's tracking of the numbers doesn't paint an accurate picture of what's happening on the site (even though he's using Facebook's own tool). Last week, Facebook launched a new offering which it seemed to hope would change the narrative on this. It's called the "Widely Viewed Content Report" (catchy!). And, obviously, it is true that "engagement" (what CrowdTangle shows) is not the be-all, end-all of what's happening on the site, but it is kinda weird how annoyed Facebook gets about the lists. You can almost hear the defensiveness in how they introduced this new report:
In this first quarterly report, our goal is to provide clarity around what people see in their Facebook News Feed, the different content types that appear in their Feed and the most-viewed domains, links, Pages and posts on the platform during the quarter. We believe this new report, when paired with the engagement data available in CrowdTangle, represents a more complete picture of what people see on Facebook. We plan to expand the scope of this report in future iterations.
And, to be fair, there is a bunch of interesting stuff in this report. It shows that, despite all the focus on Facebook links to outside sources, apparently 87% of content viewed in Facebook's News Feed doesn't even link to an outside source. And much of the rest of the report really leans hard on the fact that for most people, politics and news (and disinformation) are not a huge part of their Facebook experience. That's certainly very interesting, though it would be nicer if Facebook exposed the raw data, rather than doing this as quarterly reports.
The list doesn't just make Facebook look good though. As disinfo reporter Brandy Zadrozny dug out, one of the top links is a subscription link to the propaganda-peddling Epoch Times, which is technically banned from advertising on Facebook, but the publication figured out something of a workaround through viral posts.
Anyway, a couple days after Facebook released all this, the NY Times came out with what is a legitimate scoop: Facebook had actually planned to release a version of this report earlier this year, but (according to the article) shelved it when the most-viewed link didn't look very good for Facebook. From the article:
In that report, a copy of which was provided to The Times, the most-viewed link was a news article with a headline suggesting that the coronavirus vaccine was at fault for the death of a Florida doctor. The report also showed that a Facebook page for The Epoch Times, an anti-China newspaper that spreads right-wing conspiracy theories, was the 19th-most-popular page on the platform for the first three months of 2021.
The report was nearing public release when some executives, including Alex Schultz, Facebook’s vice president of analytics and chief marketing officer, debated whether it would cause a public relations problem, according to the internal emails. The company decided to shelve it.
Now, the fact that Facebook would shelve the report because of the possible optics is bad. Flat out. No question about it at all. It also demonstrates why having to sit around and rely on Facebook to release this report every quarter rather than just sharing the data is always going to be questionable and not engender much trust.
But... (and this is important), the NY Times piece kinda buries something rather important here. That "most-viewed link" is not to some random disinformation site, or sketchy propaganda-spewing news organization. It was... to an article from the (Pulitzer Prize winning) South Florida Sun Sentinel, which was then republished in the Chicago Tribune. The article was about a doctor who died after getting the COVID vaccine, and the title highlighted that "the CDC is investigating why." Notably, the NY Times published a very similar story to the one in the Chicago Tribune.
A day after the NY Times article, the Washington Post also published an article with the blaring headline: Facebook says post that cast doubt on covid-19 vaccine was most popular on the platform from January through March. It, too, completely buries the fact that the "casting doubt" came from the Sun Sentinel and Chicago Tribune.
The NY Times mentions the source of the "viral" article in the 13th paragraph. The WaPo article mentions it in the 5th paragraph.
To be honest, the NY Times article and framing are more defensible. The article focuses on Facebook's (highly questionable) decision to shelve the report until there was a better link in the top slot. The reporters on the Times piece -- Davey Alba and Ryan Mac, both of whom, I should note, I think are generally top notch reporters on the tech beat -- have taken issue with people calling out this aspect of their reporting. Mac pointed out that the real issue wasn't so much the problems of the original Tribune/Sentinel story (which the NY Times also had a version of), but rather the users on Facebook sensationalizing the story to imply something about the vaccine. Alba, similarly, points out that the real news is the fact that the reason this article was so popular was that it was shared widely by anti-vax groups.
And... I can see that. But, somewhat ironically, all weekend on Twitter, I kept seeing Facebook haters sharing those NY Times and Washington Post articles in the exact same way that Alba and Mac are complaining about -- by misrepresenting the key newsworthy points, and instead using it to reinforce their prior beliefs (that Facebook was somehow bad).
Very, very, very few people were noting that this is all a lot more complex. It reminds me, yet again, of the research from Yochai Benkler that showed that disinformation really only goes viral on social media after more mainstream media presents a story.
The fact is the ecosystem is a lot more complex, and the ways to deal with it are a lot more nuanced than many people seem to want to discuss or admit. The framing of the Tribune/Sentinel articles was done in a way that made it easily predictable that anti-vaxxers would take the info and run with it. And, frankly, the framing of the NYT and WaPo articles was done similarly, in a manner that made it predictable that anti-Facebookers would take the info and run with it. But the reality with both of these stories, and many more, is that they are a lot more complex. The media's role in all of this is important too: how they frame stories, and how those stories might be "weaponized" by those with narratives to tell, matters. But we don't see much discussion of that.
Instead, everyone just wants to focus on Facebook, Facebook, Facebook. And, yes, the discussion about Facebook obviously needs to be a big part of the discussion. And Facebook just has a way in which it constantly seems to do the wrong thing at the wrong time (and its handling of this mess isn't great). But it's not Facebook that wrote that Sun Sentinel article. So, why does Facebook deserve more of the blame for the fact that people took it and misrepresented it than the Sun Sentinel (and the Chicago Tribune, which redistributed it)?
And, at a deeper level, if the idea is to point fingers -- who was better positioned to have prevented that mess? Is it Facebook? For that to be true, Facebook would have to somehow know to distrust an article from the Chicago Tribune that it has no deeper knowledge of. That seems like a really big ask. On the other hand, it seems that the Sun Sentinel -- including the reporters and editors on that story -- as well as the Chicago Tribune, which actively chose to redistribute the story, all would have had (a) much more knowledge, (b) a greater ability to recognize the potential risk of how the story was framed, and (c) the ability to fix it.
We're not going to solve the problems associated with a bunch of people believing in nonsense if we ignore the underlying parties responsible for the content itself, and focus just on the intermediaries. That doesn't mean we should ignore Facebook, but it's a reminder to view the overall ecosystem.
Filed Under: blame, disinformation, engagement, journalism, reporting, vaccines
Companies: facebook, ny times, washington post