Let Me Rewrite That For You: Washington Post Misinforms You About How Facebook Weighted Emoji Reactions
from the let's-clean-it-up dept
Journalist Dan Froomkin, one of the most insightful commentators on the state of the media today, recently began a new effort he calls "let me rewrite that for you," in which he takes a piece of journalism he believes misled readers and rewrites parts of it -- mainly the headline and the lede -- to better present the story. I think it's a brilliant and useful form of media criticism that I figured I might experiment with as well -- and I'm going to start with a recent Washington Post piece, one of many the Post has written about the leaked Facebook Files from whistleblower Frances Haugen.
The piece is written by reporters Jeremy Merrill and Will Oremus -- and I'm assuming that, as at many mainstream news orgs, editors wrote the headline and subhead rather than the reporters. I don't know Merrill, but I will note that I find Oremus to be one of the most astute and thoughtful journalists out there today, and not one prone to fall into the usual traps journalists fall for -- so this one surprised me a bit (though that's also why I'm trying this format on an Oremus piece: I'm pretty sure he'll take the criticism in the spirit intended, a push for better overall journalism on these kinds of topics). The article's headline tells a story in and of itself: Five points for anger, one for a ‘like’: How Facebook’s formula fostered rage and misinformation, with a subhead that implies something similar: "Facebook engineers gave extra value to emoji reactions, including ‘angry,’ pushing more emotional and provocative content into users’ news feeds." There's also a graphic that reinforces this suggested point: Facebook weighted "anger" much more than happy reactions. And it's all under the "Facebook under fire" designation:
Seeing this headline and image, it would be entirely normal to take away the clear implication: people reacting happily (e.g. with "likes") on Facebook had those shows of emotion weighted at 1/5th the intensity of people reacting angrily (e.g. with "anger" emojis), and that is obviously why Facebook stokes tremendous anger, hatred and divisiveness (as the story goes).
But... that's not actually what the details show. The actual details show that when Facebook initially introduced its list of five different "emoji" reactions (added alongside the long-iconic "like" button), it weighted all five of them as five times as impactful as a like. That means that "love," "haha," "wow," and "sad" were also weighted at five times a single like -- identical to "angry." And while the article does mention this in the first paragraph, it immediately pivots to focus only on the "angry" weighting and what that means. When combined with the headline and the rest of the article, it's entirely possible to read the piece and not even realize that "love," "sad," "haha," and "wow" were also ranked at 5x a single "like," and to come away believing that Facebook deliberately chose to ramp up promotion of anger-inducing content. It's not only possible, it's quite likely. Hell, it's how I read the article the first time through, completely missing the fact that the weighting applied to the other emojis as well.
The article also completely buries how quickly Facebook realized this was an issue and adjusted the policy. While it does mention it, it's very much buried late in the story, as are some other relevant facts that paint the entire story in a very different light than the way many people are reading it.
When some people highlighted this, Oremus pointed out that the bigger story here is "how arbitrary initial decisions, set by humans for business reasons, become reified as the status quo." And he's right. That is the more interesting story and one worth exploring. But that's not how this article is presented at all! And his own article suggests the "reified as the status quo" part is inaccurate as well, though, again, that's buried further down in the story. The article is very much written in a way where the takeaway for most people is going to be "Facebook highly ranked posts that made you angry, because stoking divisiveness was good for business, and that's still true today." Except none of that is accurate.
So... let's rewrite that, and try to better get across the point that Oremus claims was the intended point of the story.
The original title, again, is:
Five points for anger, one for a ‘like’: How Facebook’s formula fostered rage and misinformation
Let's rewrite that:
Facebook weighted new emojis much more than likes, leading to unintended consequences
Then there's the opening of the piece, which does mention very quickly that it applied to all five new emojis, but quickly pivots to just focusing on the anger:
Five years ago, Facebook gave its users five new ways to react to a post in their news feed beyond the iconic “like” thumbs-up: “love,” “haha,” “wow,” “sad” and “angry.”
Behind the scenes, Facebook programmed the algorithm that decides what people see in their news feeds to use the reaction emoji as signals to push more emotional and provocative content — including content likely to make them angry. Starting in 2017, Facebook’s ranking algorithm treated emoji reactions as five times more valuable than “likes,” internal documents reveal. The theory was simple: Posts that prompted lots of reaction emoji tended to keep users more engaged, and keeping users engaged was the key to Facebook’s business.
Facebook’s own researchers were quick to suspect a critical flaw. Favoring “controversial” posts — including those that make users angry — could open “the door to more spam/abuse/clickbait inadvertently,” a staffer, whose name was redacted, wrote in one of the internal documents. A colleague responded, “It’s possible.”
The warning proved prescient. The company’s data scientists confirmed in 2019 that posts that sparked angry reaction emoji were disproportionately likely to include misinformation, toxicity and low-quality news.
Let's rewrite that, both using what Oremus claims was the "bigger story" in the article, and some of the information that is buried much later.
Five years ago, Facebook expanded the ways that users could react to posts beyond the iconic "like" thumbs-up, adding five more emojis: "love," "haha," "wow," "sad," and "angry." With this new addition, Facebook engineers needed to determine how to weight these new engagement signals. Given the stronger emotions portrayed in these emojis, the engineers made a decision with a large impact on how stories would be ranked: each of those reactions would count for five times the weight of the classic "like" button. While Facebook did publicly say at the time that the new emojis would be weighted "a little more" than likes, and that all the new emojis would be weighted equally, it did not reveal that the weighting was actually five times as much.
This move came around the same time as Facebook's publicly announced plans to move away from promoting clickbait-style news to users, and to try to focus more on engagement with content posted by friends and family. However, it turned out that friends and family don't always post the most trustworthy information, and by overweighting the "emotional" reactions, this new move by Facebook often ended up putting the most emotionally charged content in front of users. Some of that content was joyful -- people reacting with "love" to engagements and births -- but some of it was disruptive and divisive, people reacting with "anger" to false or misleading content.
Facebook struggled internally with this result -- while also raising important points about how "anger" as a "core human emotion" is not always tied to bad things, and could be important for giving rise to protest movements against autocratic and corrupt governments. However, since other signals were weighted significantly more than even these emojis -- for example, replies to posts had a weight up to 30 times a single "like" click -- not much was initially done to respond to the concerns about how the weighting on anger might impact the kinds of content users were prone to see.
However, one year after launch, in 2018, Facebook realized weighting "anger" so highly was a problem, and downgraded the weighting on the "anger" emoji to four times a "like" while keeping the four other emoji, including "love," "wow," and "haha" at five times a like. A year later, the company realized this was not enough and even though "anger" is the least used emoji, by 2019 the company had put in place a mechanism to "demote" content that was receiving a disproportionate level of "anger" reactions. There were also internal debates about reranking all of the emoji reactions to create better news feeds, though there was not widespread agreement within the company about how best to do this. Eventually, in 2020, following more internal research on the impact of this weighting, Facebook reweighted all of the emoji. By the end of 2020 it had cut the weight of the "anger" emoji to zero -- taking it out of the equation entirely. The "haha" and "wow" emojis were weighted to one and a half times a like, and the "love" and "sad" were weighted to two likes.
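To make the shifting weights concrete, here is a minimal, hypothetical sketch of how reaction counts might feed into an engagement score. The weight values come from the reporting above; everything else -- the function, the names, the idea that the score is a simple weighted sum with no other signals -- is an illustrative assumption, not Facebook's actual code:

```python
# Illustrative reaction weights, per the reporting. The real ranking
# formula (other signals, normalization, demotions) is not public.
WEIGHTS_2017 = {"like": 1, "love": 5, "haha": 5, "wow": 5, "sad": 5, "angry": 5}
WEIGHTS_LATE_2020 = {"like": 1, "love": 2, "haha": 1.5, "wow": 1.5, "sad": 2, "angry": 0}

def engagement_score(reactions, weights):
    """Hypothetical score: sum of each reaction count times its weight."""
    return sum(weights.get(kind, 0) * count for kind, count in reactions.items())

post = {"like": 100, "angry": 40}
print(engagement_score(post, WEIGHTS_2017))       # 100*1 + 40*5 = 300
print(engagement_score(post, WEIGHTS_LATE_2020))  # 100*1 + 40*0 = 100
```

The same 40 angry reactions that once tripled this hypothetical post's score contribute nothing under the late-2020 weights -- which is the substance of the change the Post's framing obscures.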
From there, the article could then discuss a lot of what other parts of the article does discuss, about some of the internal debates and research, and also the point that Oremus raised separately, about the somewhat arbitrary nature of some of these ranking systems. But I'd argue that my rewrite presents a much more accurate and honest portrayal of the information than the current Washington Post article.
Anyone know how I can send the Washington Post an invoice?
Filed Under: algorithm, emoji, facebook papers, framing, frances haugen, journalism, let me rewrite that for you, ranking, reactions
Companies: facebook
Reader Comments
Misinformation
Another example of a newspaper becoming a source of misinformation, while blaming social media for its own stirring of the pot.
Accuracy Isn't As Profitable
Your headline is more accurate, but clickbait headlines are written because they generate more views and more ad revenue. Without the sensationalism, you are costing them money. They are going to be tempted to bill YOU!
I miss truth.
It's pretty bad when Facebook is a "victim". I mean they're swine, they're an awful debasement of what should have been a free and noncommercial internet, they're a monopoly in need of busting, but they are NOT dissidents in need of implanted brain chips. Which is what the government wants - a big, powerful Facebook made utterly subservient to their needs, to be the buck overseer over the wretched proles.
this explains a lot
Thanks Mike for delving into the details of how their algorithms work. Anecdotally, I did note a preponderance of "extreme" articles in my feed that has since toned down a bit.
Naive me thought it was my reactive comments that caused it...now I know better.
Re: Accuracy Isn't As Profitable
So you approve of lying if it makes someone else money. Got it.
Re: Accuracy Isn't As Profitable
There are other ways to generate active interaction than sensationalism, it just takes more work, and a lot of news organisations seem to have decided that actual journalism isn't worth paying for.
"Without the sensationalism, you are costing them money."
I cost them money anyway by not taking the bait and hiding news from sources that are particularly annoying in this regard.
I really enjoyed this article, and would like to see more of them.
[ link to this | view in thread ]