NY Times And Washington Post Criticize Facebook Because The Chicago Tribune Had A Terrible Headline
from the wait,-whose-fault-is-it-now? dept
I'm going to try, once again, to do that stupid thing where I try to express a nuanced point on the internet, where there's a high likelihood of it being misunderstood. So, consider this opening a warning that you should read this entire article to try to get at the larger point.
And, along those lines, there are two parts to this story. While much of it is going to point some fingers at the NY Times and Washington Post for how they presented a story that suggested blaming Facebook for something that isn't actually a Facebook issue, that shouldn't be seen as letting Facebook off the hook; it doesn't come out of this story looking very good either. Basically, this is a story that shows just how complex our information ecosystem is when it comes to misinformation, and how ineffective simple blame games tend to be.
But, first, some background: for a long time, NY Times reporter Kevin Roose has used Facebook's own CrowdTangle tool to highlight which content on Facebook gets the most engagement. It is a consistently useful tool for showing that claims that Facebook has an "anti-conservative bias" are bullshit. It constantly shows top "conservative" personalities like Ben Shapiro, Dan Bongino, and others as having the most engagement on the site.
For reasons I don't fully understand, Facebook has always hated this, and has wasted a lot of effort repeatedly insisting that Roose's tracking of the numbers doesn't paint an accurate picture of what's happening on the site (even though he's using Facebook's own tool). Last week, Facebook launched a new offering it seemed to hope would change the narrative on this: the "Widely Viewed Content Report" (catchy!). And, obviously, it is true that "engagement" (what CrowdTangle shows) is not the be-all, end-all of what's happening on the site, but it is kinda weird how annoyed Facebook gets about the lists. You can almost hear the defensiveness in how it introduced this new report:
In this first quarterly report, our goal is to provide clarity around what people see in their Facebook News Feed, the different content types that appear in their Feed and the most-viewed domains, links, Pages and posts on the platform during the quarter. We believe this new report, when paired with the engagement data available in CrowdTangle, represents a more complete picture of what people see on Facebook. We plan to expand the scope of this report in future iterations.
And, to be fair, there is a bunch of interesting stuff in this report. It shows that, despite all the focus on Facebook links to outside sources, apparently 87% of content viewed in Facebook's News Feed doesn't even link to an outside source. And much of the rest of the report really leans hard on the fact that for most people, politics and news (and disinformation) are not a huge part of their Facebook experience. That's certainly very interesting, though it would be nicer if Facebook exposed the raw data, rather than doing this as quarterly reports.
The list doesn't just make Facebook look good, though. As disinfo reporter Brandy Zadrozny dug out, one of the top links was a subscription link to the propaganda-peddling Epoch Times, which is technically banned from advertising on Facebook, but which figured out something of a workaround through viral posts:
How did the Epoch Times—an outlet barred from Facebook advertising for repeatedly violating its policies—get its subscription link to all those people? A search in CrowdTangle shows us that it's through viral posts about "cute toddlers," dogs, and homeless people getting cash. pic.twitter.com/a6MmZS1DMV
— Brandy Zadrozny (@BrandyZadrozny) August 19, 2021
Anyway, a couple days after Facebook released all this, the NY Times came out with what is a legitimate scoop: Facebook had actually planned to release a version of this report earlier this year, but (according to the article) shelved it when the most-viewed link didn't look very good for Facebook. From the article:
In that report, a copy of which was provided to The Times, the most-viewed link was a news article with a headline suggesting that the coronavirus vaccine was at fault for the death of a Florida doctor. The report also showed that a Facebook page for The Epoch Times, an anti-China newspaper that spreads right-wing conspiracy theories, was the 19th-most-popular page on the platform for the first three months of 2021.
The report was nearing public release when some executives, including Alex Schultz, Facebook’s vice president of analytics and chief marketing officer, debated whether it would cause a public relations problem, according to the internal emails. The company decided to shelve it.
Now, the fact that Facebook would shelve the report because of the possible optics is bad. Flat out. No question about it at all. It also demonstrates why having to sit around and rely on Facebook to release this report every quarter, rather than just sharing the raw data, is always going to be questionable and is never going to engender much trust.
But... (and this is important), the NY Times piece kinda buries a key detail here. That "most-viewed link" was not to some random disinformation site or sketchy propaganda-spewing news organization. It was... to an article from the (Pulitzer Prize-winning) South Florida Sun Sentinel that was then republished by the Chicago Tribune. The article was about a doctor who died after getting the COVID vaccine, and the headline highlighted that "the CDC is investigating why." Notably, the NY Times published a very similar story to the one in the Chicago Tribune.
A day after the NY Times article, the Washington Post also published an article with the blaring headline: Facebook says post that cast doubt on covid-19 vaccine was most popular on the platform from January through March. It, too, completely buries the fact that the "casting doubt" came from the Sun Sentinel and Chicago Tribune.
The NY Times mentions the source of the "viral" article in the 13th paragraph. The WaPo article mentions it in the 5th paragraph.
To be honest, the NY Times' article and framing are more defensible than the Post's. The piece focuses on Facebook's (highly questionable) decision to shelve the report until there was a better link in the top slot. The reporters on the Times piece -- Davey Alba and Ryan Mac, both of whom, I should note, I think are generally top-notch reporters on the tech beat -- have taken issue with people calling out this aspect of their reporting. Mac pointed out that the real issue wasn't so much the problems with the original Tribune/Sentinel story (which the NY Times also had a version of), but rather users on Facebook sensationalizing the story to imply the vaccine was to blame. Alba, similarly, points out that the real news is that the article was so popular because it was shared widely by anti-vax groups.
And... I can see that. But, somewhat ironically, all weekend on Twitter, I kept seeing Facebook haters sharing those NY Times and Washington Post articles in the exact same way that Alba and Mac are complaining about -- by misrepresenting the key newsworthy points, and instead using it to reinforce their prior beliefs (that Facebook was somehow bad).
Very, very, very few people were noting that this is all a lot more complex. It reminds me, yet again, of the research from Yochai Benkler that showed that disinformation really only goes viral on social media after more mainstream media presents a story.
The fact is the ecosystem is a lot more complex, and the ways to deal with it are a lot more nuanced, than many people seem to want to discuss or admit. The Tribune/Sentinel articles were framed in a way that made it easily predictable that anti-vaxxers would take the info and run with it. And, frankly, the NYT and WaPo articles were framed similarly, in a manner that made it easy for anti-Facebookers to take the info and run with it. But the reality with both of these stories, and many more, is that they are a lot more complex. The media's role in all of this is important too: how outlets frame stories matters, as does how those stories might be "weaponized" by those with narratives to tell. But we don't see much discussion of that.
Instead, everyone just wants to focus on Facebook, Facebook, Facebook. And, yes, Facebook obviously needs to be a big part of the discussion. And Facebook just has a way of constantly seeming to do the wrong thing at the wrong time (and its handling of this mess isn't great). But it's not Facebook that wrote that Sun Sentinel article. So, why does Facebook deserve more of the blame for the fact that people took it and misrepresented it than the Sun Sentinel (and the Chicago Tribune who redistributed it)?
And, at a deeper level, if the idea is to point fingers -- who was better positioned to have prevented this mess? Is it Facebook? For that to be true, Facebook would have to somehow know to distrust an article from the Chicago Tribune that it has no deeper knowledge of. That seems like a really big ask. On the other hand, the Sun Sentinel -- including the reporters and editors on that story -- as well as the Chicago Tribune, which actively chose to redistribute the story, all had much more (a) knowledge of the story, (b) ability to recognize the potential risk in how it was framed, and (c) ability to fix it.
We're not going to solve the problems associated with a bunch of people believing in nonsense if we ignore the parties responsible for the underlying content itself and focus just on the intermediaries. That doesn't mean we should ignore Facebook, but it's a reminder to look at the overall ecosystem.
Filed Under: blame, disinformation, engagement, journalism, reporting, vaccines
Companies: facebook, ny times, washington post
Reader Comments
Clamoring For A Gatekeeper
If someone personally leans left in their political beliefs, but then sees that perhaps Fox News had the best viewership numbers last week, it doesn't mean that this individual now approves of Fox News' presentation. This person likely still legitimately disagrees with Fox News, despite its popularity.
Similarly, engagement numbers do not rule out bias. It is possible for a site to be biased, and yet the thing that they are biased against may still experience high engagement, despite their best efforts to shape a different outcome.
Anyone can write it. But only FB is in a position to put their thumb on the scale. They want FB to press harder.
Re: Clamoring For A Gatekeeper
It is also possible that you are so biased towards republican propaganda that everything can be twisted to your belief system.
Re: Re: Clamoring For A Gatekeeper
Perhaps. However, I have a fairly simple explanation for why things are the way they are, whereas others are perplexed and confused by these seemingly insane events and behaviors. It reinforces the idea that I am biased, and correct.
That would only explain how left-leaning content ends up being somewhat popular on Facebook in spite of the attempts by Facebook to bias the site in favor of conservative content (for whatever reason).
No, they really aren’t. Any site can bias its algorithms and moderation and such in favor of one type of content or another. Your issue isn’t that Facebook has its thumb on the scale—your issues are that Facebook is popular and Facebook seemingly has a left-leaning bias (despite the evidence that says otherwise).
A simple explanation isn’t always the right one. I could say that COVID-19 exists because “some asshole in China willingly unleashed a new plague”; that may be a simple explanation for the pandemic, but it is by no means the correct one.
Saying you’re correct doesn’t make it so. Saying you’re biased, on the other hand… 👀
Re: Re: Re: Clamoring For A Gatekeeper
For every complex problem there is an answer that is clear, simple, and wrong.
-- H. L. Mencken
Re: Re: Re: Clamoring For A Gatekeeper
Through confirmation bias and the Dunning-Kruger effect, no less.
Re: Clamoring For A Gatekeeper
Which conservative stuff is it against which they are biased?
Re: Clamoring For A Gatekeeper
So Facebook treating right-wing posters with kid gloves, going over and above to make sure they are not moderated, means that they are still biased against right-wing posters? How do you live in such a constant state of delusional cognitive dissonance?
Also, why do you refuse to give us any specific examples of conservative opinions being censored solely because they are conservative opinions?
I will keep asking you this until you provide an answer, otherwise, give me a reason why your comments shouldn't just be flagged as trolling and ignored.
The belief that people can be prevented from believing in nonsense by deleting it from social media, is nonsense.
People have believed in nonsense all throughout history. The only solution to ignorance is education. Hiding things makes it look like there is something to hide.
When Facebook didn't exist, these mofos didn't think to blame the air or pens. Stupid people are gonna stupid. While Facebook may amplify some stuff, that stuff always existed and people spread it gullibly and with an agenda.
The largest real effect of FB et al, it would seem, is that the rest of us get to see how much bullshit some other people want to consume and spread, and who some of them are.
i'd call that a net positive. and i really don't care much for FB at all.
Re:
I don't remember anyone blaming newspapers for people quoting them out of context before Facebook, either.
"Kill the messenger" is still a fan favorite apparently.
Nuance?
"that stupid thing where I try to express a nuanced point on the internet"
I look forward to deliberately misinterpreting everything you're trying to say, Mike.
Re: Nuance?
Koby's life motto!
Careful with those glass houses
Before newspapers rip into Facebook for allowing the nutters on their platform, or not doing enough to keep them off, they might want to do a little housecleaning of their own. I distinctly remember several previous TD articles about people lying about and/or misrepresenting subjects such as 230 being given space in newspapers, not to mention the example in this article, where their coverage was sloppy and easily warped/used/misread by people.
You funny!
"...you should read this entire article to try to get at the larger point" Ha. Haha. Hahahaha!! Yeah, right.
"So, why does Facebook deserve more of the blame for the fact that people took it and misrepresented it than the Sun Sentinel (and the Chicago Tribune who redistributed it)?"
Because Facebook has billions of users, far more than the Sun Sentinel or Chicago Tribune? Because they buried the first report when they didn't like what it revealed? Because Facebook is where this stuff gets contextualised as anti-vax?
Because sometimes you're so busy going on about other people missing nuance that you ignore it yourself?
Re:
Because Facebook has billions of users, far more than the Sun Sentinel or Chicago Tribune?
I'm not sure how that changes the fact that if the original article were written better none of this would have been a problem. You ignore the point of the article, that Facebook is not the one in the best position to fix this.
Because they buried the first report when they didn't like what it revealed?
Yes. I called them out for that bullshit act. Did you not read the fucking article?
Because Facebook is where this stuff gets contextualised as anti-vax?
Yes, because the original articles were poorly written. Note that Facebook is not the one who contextualized it. Why do you blame the tool?
Because sometimes you're so busy going on about other people missing nuance that you ignore it yourself?
Nothing you point out refers to any nuance I missed. Care to try again?