When Facebook Turned Off Its News Feed Algorithm, It Made Everyone's Experience Worse... But Made Facebook More Money
from the oh,-look-at-that dept
For reasons I don't fully understand, over the last few months, many critics of "big tech" (and Facebook in particular) have latched onto the idea that "the algorithm" is the problem. It's been almost weird how frequently people insist to me that if only social media got rid of algorithmic recommendations and went back to the old-fashioned chronological news feed, all would be good in the world again. Some of this seems based on the idea that algorithms are primed to lead people down a garden path from one type of video to ever more extreme videos (which has certainly happened, though how often is never made clear). Some of it seems to be a kneejerk reaction to simply disliking the fact that these companies (which many people don't really trust) are making decisions about what you may and may not like -- and that feels kinda creepy.
In the past few weeks, this topic has reached a bit of a fever pitch, partly in response to whistleblower Frances Haugen's leak of documents, in which she argues that Facebook's algorithm is a big part of the problem. And then there's the recent attempt by some Democrats in Congress to strip Section 230 protections from algorithmically recommended information. As I noted, that bill is so problematic that it's not clear what it's actually solving.
But underlying all of this is a general opinion that "algorithms" and "algorithmic recommendations" are inherently bad and problematic. And, frankly, I'm confused by this. At a personal level, the tools I've used that do algorithmic recommendations (mainly: Google News, Twitter, and YouTube) have been... really, really useful? They've also been pretty accurate over time in learning what I want, and thus have provided me more useful content in a more efficient manner, which has been pretty good for me, personally. I recognize that not everyone has that experience, but at the very least, before we unilaterally declare algorithms and recommendation engines bad, it might help to understand how often they recommend stuff that's useful and helpful, as compared to how often they cause problems.
And, for all the talk about how Haugen's leaks have shone a light on the "dangers" of algorithms, the actual documents she's leaked might suggest something else entirely. Reporter Alex Kantrowitz has reported on one of the leaked documents, regarding a study Facebook did on what happens when it turns off the algorithmic rankings, and... it was not pretty. But, contrary to common belief, Facebook actually made more money without the News Feed algorithm.
In February 2018, a Facebook researcher all but shut off the News Feed ranking algorithm for .05% of Facebook users. “What happens if we delete ranked News Feed?” they asked in an internal report summing up the experiment. Their findings: Without a News Feed algorithm, engagement on Facebook drops significantly, people hide 50% more posts, content from Facebook Groups rises to the top, and — surprisingly — Facebook makes even more money from users scrolling through the News Feed.
Considering how often we've heard, including from Haugen herself, that Facebook's decision-making is almost always driven by whatever will most benefit the bottom line, this deserves some consideration, because the document... suggests something quite different. In fact, what the researchers seemed to find was that people hated the change, but it made them spend more time on the site and see more ads, because they had to poke around to find the interesting stuff they wanted to see -- and that drove up ad revenue. If Facebook were truly focused on just the bottom line, then, it should consider turning off the News Feed algorithm -- or just support the awful JAMA bill in Congress, which would create incentives for the same result:
Turning off the News Feed ranking algorithm, the researcher found, led to a worse experience almost across the board. People spent more time scrolling through the News Feed searching for interesting stuff, and saw more advertisements as they went (hence the revenue spike). They hid 50% more posts, indicating they weren’t thrilled with what they were seeing. They saw more Groups content, because Groups is one of the few places on Facebook that remains vibrant. And they saw double the amount of posts from public pages they don’t follow, often because friends commented on those pages. “We reduce the distribution of these posts massively as they seem to be a constant quality complaint,” the researcher said of the public pages.
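The mechanism behind that revenue spike is simple enough to put rough numbers on. Below is a toy sketch in Python; every figure in it (ad load, CPM, session lengths) is a made-up assumption for illustration, not a number from the leaked study.

```python
# Toy model of the effect described above: a worse-ranked feed can raise
# revenue simply because users scroll longer and see more ads along the way.
# All constants here are hypothetical, chosen only to illustrate the mechanism.

ADS_PER_MINUTE = 4      # assumed ad load while scrolling
CPM_DOLLARS = 10.0      # assumed revenue per 1,000 ad impressions

def session_revenue(minutes_scrolled: float) -> float:
    """Revenue from one session, assuming ads are interleaved at a fixed rate."""
    impressions = minutes_scrolled * ADS_PER_MINUTE
    return impressions / 1000 * CPM_DOLLARS

# Ranked feed: the user finds the interesting stuff in 10 minutes.
# Unranked feed: the same user hunts around for 15 minutes.
print(session_revenue(10))   # ~$0.40 -- ranked, better experience
print(session_revenue(15))   # ~$0.60 -- unranked: worse experience, more revenue
```

The only point of the toy model is that ad impressions scale with time spent on the feed; it says nothing about retention, which is exactly the caveat that follows.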
As always, there are lots of factors that go into this, and one experiment may not be enough to tell us much. Also, it's entirely possible that, over time, the long-term result would be less revenue, because the increasing annoyance of not finding the more interesting stuff causes people to leave the platform entirely. But, at the very least, this leaked research pokes a pretty big hole in the idea that getting rid of algorithmic recommendations does anything particularly useful.
Filed Under: algorithms, chronological, facebook files, facebook papers, frances haugen, jama, leaks, news feed, section 230, user experience
Companies: facebook
Reader Comments
I would disagree that hiding content indicates displeasure. I expect to see things I'm not interested in. I'm Facebook friends with old friends and family members I have little in common with anymore. I use it to keep up, not to engage. I need to know if someone died or got married or whatever.
And the most annoying thing about the algorithm is that the next time you log in, it hides what you were previously looking at, including your own posts. You don't always engage with something the first time you see it, but Facebook seems to assume you're a four-year-old who needs new stimulation every time you log in.
The algorithm is patronizing and useless.
Re:
To you. Have you considered that you may not be a typical Facebook user?
Re: Re:
Is there really such a thing as a "typical" user on a site that has billions of users across all parts of the world, most cultures, most age groups, etc.?
But, always remember that the user is not Facebook's customer. If annoying a user who continues to use the site regardless is better for advertisers and Facebook's "real" customers, then they will do that.
Can I flag this article for being misleading at best?
I think the opinion ignores the elephant in the room: engagement. If engagement drops, it follows that advertising drops. The advertising revenue being artificially held steady makes for a misleading opinion.
Re:
I was thinking something similar about the long term: while there might be a temporary spike in ad revenue, if users stay ticked off for long enough, there might very well be enough of an exodus (whether full-on account deletions, or just people who stop logging in any more) over the long haul that it wouldn't be worth it.
Re: Re:
A point covered in the article, so I have to ask you and the person you are responding to if you actually read the article.
Re: Re: Re:
Serious question: how was this addressed? It was acknowledged as possible, and then ignored. This opinion cites nothing, so again, how was this addressed?
Re: Re: Re: Re:
To discuss the long term impact of the experiment in any more detail would be pure speculation, and demanding that people do that is at the root of a lot of conspiracy theories and disinformation.
Re: Re: Re: Re:
Right here:
“What happens if we delete ranked News Feed?” they asked in an internal report summing up the experiment.
Their findings: Without a News Feed algorithm, engagement on Facebook drops significantly, people hide 50% more posts, content from Facebook Groups rises to the top, and — surprisingly — Facebook makes even more money from users scrolling through the News Feed.
Re: Re: Re:
Neither the original article nor Mike's summary here addressed my point about the long-term effects: as engagement drops, what's the likelihood that users start dropping Facebook altogether? You can't make any money on a News Feed that users aren't scrolling through at all because they've stopped logging in...
Re: Re: Re: Re:
It's addressed right here:
"Also, it's entirely possible that over time, the long term result would be less revenue because the increasing annoyances of not finding the more interesting stuff causes people to leave the platform entirely."
The long term effects are not known, so nobody can answer the question with certainty. So what more do you want?
Re: Re: Re: Re: Re:
Given that the long-term effects are unknown, the headline is misleading. And it seems that Facebook's managers expect the long-term effect to be reduced profits; otherwise they would have pursued this research further, and if it had really come out as increasing profits, we would now have a Facebook that looks like what was tried out on that 0.05%.
Re: Re: Re: Re: Re: Re:
Or, your assumptions about how Facebook makes decisions are incorrect.
Re:
I think you're onto something. Although the study found that FB made more money because of the News Feed scrolling, it could be just a short-term effect. If a drug addict develops a tolerance, he might try to buy a lot more. But if the tolerance eventually results in no high whatsoever, then he might quit entirely. The social media drug dealers know that they can measure addiction through engagement, and they need it to maintain that constant advertising revenue stream. FB must deliver the digital poison such that users never develop that tolerance and get burnt out.
Go turn in your homework, Koby.
Re:
"Can I flag this article for being misleading at best?"
You can do what you want. But it would help if you explained what was misleading.
"I think the opinion ignores the elephant in the room: engagement. If engagement drops, it follows that advertising drops. The advertising revenue being artificially held steady makes for a misleading opinion."
Addressed in the piece. So which part was misleading?
Re:
Not really. Engagement like "liking", commenting, and sharing isn't required for the feed to show ads. It just needs the user to scroll through it, which the study shows they spent more time doing -- hence more ads and more ad revenue.
The company that you keep
Like real-world social institutions, your experience depends on the company that you keep. What I wonder is how many consider the truth to be extremism (hello, Bart).
But but but the algorithm killed Jesus.
(Yes it is scary in my head, billboards from shitty advertising campaigns have space in there)
The algorithm learns what you like and gives you more of it.
If we put a child in front of an endless supply of ice cream, they will eventually vomit.
Far too often humans do that stereotype thing where they keep shoving apples & oranges into a label that does not actually describe either fruit.
Then there is the concept of personal responsibility (and yes, children don't have the capacity for that, but who should: Zuck or the parents?). "I wouldn't have found this if they hadn't FORCED me to keep clicking, I had no control, it's not my fault."
No skippy, it is your fault.
You showed the computer what you liked, kept pursuing it, & well, that's where the rabbit hole led you.
The computer didn't know it was serving up horrible videos pretending to be kid videos; no computer can know that.
While it's nice to blame the algorithm for doing it, exactly how long was your child unsupervised on YT?
How many times did you bother to check on what they were watching and clicking on next?
Yes there are bad things out there, but we can't discuss these things. We aren't using the same definitions & we end up bogged down in people claiming their definition is the right one (even if it's stuffed full of unrelated things), so everyone is talking over each other trying to win, but doing nothing about the actual "problem".
There are people who honestly believe that a million 15 yr olds are kidnapped and trucked into the Superb Owl every year.
Anyone who tries to point out that's impossible is attacked as supporting child abuse.
And we end up with more laws that do the opposite of what was intended & make things that much worse for the few 15 yr olds who get lured by bad actors, because people are looking for a group of 100 trafficked kids.
It would be nice if people started stripping down the labels to the bare bones meaning everyone can agree on, without all the extra baggage attached, so discussions can happen with everyone using the same understandings of things.
Facts not Feels
Denotation not Connotation
Algorithm says, "well, you've shown interest in these topics before, so I'll presume you're still interested in these topics." This isn't all that might be there, but it's the essential gist of the matter.
So, still, it's all about reflecting the interests of the user back to the user, with or without levels of exploration and discovery of related topics.
How is this so hard for people in Congress to understand? (Oh, right... these people are grossly stupid.)
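For what it's worth, that "reflect the interests of the user back" gist fits in a few lines of code. Here's a deliberately naive sketch in Python; the post fields, topics, and scoring are invented for illustration and have nothing to do with Facebook's actual system:

```python
from collections import Counter

def rank_feed(posts, interest_counts):
    """Order posts by how often the user previously engaged with each topic.

    A naive illustration of "reflecting interests back": past engagement is
    the only signal, so the ranking can only echo what you already liked.
    """
    return sorted(posts, key=lambda p: interest_counts[p["topic"]], reverse=True)

# Hypothetical engagement history: many clicks on woodworking, a few on politics.
interests = Counter({"woodworking": 12, "politics": 2})

feed = [
    {"id": 1, "topic": "politics"},
    {"id": 2, "topic": "woodworking"},
    {"id": 3, "topic": "knitting"},  # never engaged with; Counter returns 0
]

print([p["id"] for p in rank_feed(feed, interests)])  # [2, 1, 3]
```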
Like any tool an algorithm can be good or bad, well-crafted or bodgy, used for good or evil.
But the thing is, I don't care how good or bad the algorithm is. One thing in common with all algorithms is that they need some sort of data as input. And of course, the more personalized the algorithm, the more personal data it needs from me to function. That is what I care about. That is what I don't want. I don't want any system -- whether private or government -- to collect and analyze data on me beyond what is required to provide the services I want to receive. I don't want personalised ads. I don't want personalised news feeds. I don't want my data being collected, analysed, assessed, used, mined, collated, experimented with, or used as AI training data, whether by the service itself or sold or otherwise passed on to third parties. I don't want to be the product.
Re:
"I don't want to be the product."
Then don't use a service where that's the business model...
Re: Re:
I don't ...
You like it (algorithms) because you're being assimilated -- resistance is futile!
So, experimentally, FB discovered what it is like grocery shopping at some places.
I'm not a Facebook user, but in general I find that "dumb is beautiful" when it comes to computers. I actually preferred the bad old days when a search engine would spit back results at you in no particular order that contained all of your search terms in some place you could control-F for them, rather than foisting off whatever they happen to want you to read with none of your search terms anywhere to be seen. Why shouldn't news be the same way? Make people type another word if that's what they want to narrow things down.
This doesn't surprise me, actually. Since the feed has been "ranked", it's been something of a pain to use, with posts being out of chronological order -- I know I've missed out on several local events that were only announced on FB, but the post didn't show up in my feed until after the event had happened.
I'm probably more active with filtering the feed when I see stuff I'm not interested in or "friends" who start sharing outright nonsense, but just having the damn thing in chronological order would really help me out instead of FB trying to guess which of my diverse interests and globally distributed contacts I want to hear from at a specific moment in time.
Open a private browser session
and see what YouTube puts on its front page.
That's why algorithms are shit: when not carefully curated, they reflect the worst of human behaviour.
Well done for teaching algorithms to be reasonable, but if that's a requirement for them to work well, then they are broken. By default they promote the most extreme content they can find.
Re: Open a private browser session
So... the worst of human behavior? Maybe you should have actually conducted this experiment before commenting, because with that run it was mostly music.
Re: Re: Open a private browser session
Maybe he really hates jazz music?
Re: Re: Open a private browser session
The grass is always greener, as has been said, but those look like wholesome and sane defaults compared with my own.
Re: Re: Open a private browser session
I'll just note that YouTube won't display any kind of standard default to new users; it will still filter depending on the geolocation of the IP and various other factors. Maybe he just really needs to take a good look at his neighbours?
Re: Open a private browser session
Were you on the front page of BitChute instead of YouTube?
As discussed in the unfollow plugin article, a major issue with the news feed is that Facebook automatically signs you up to follow everything and the kitchen sink, and it's a chore to manually prune that list to the things you actually want.
If that were changed, an option for an unranked newsfeed could well work better for someone who was only following a small number of trusted sources.
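As a sketch of what that unranked option could look like (in Python, with hypothetical post fields and a small curated follow list), it's just a filter plus a newest-first sort:

```python
from datetime import datetime

def chronological_feed(posts, followed):
    """Unranked feed: only sources the user explicitly follows, newest first."""
    wanted = [p for p in posts if p["source"] in followed]
    return sorted(wanted, key=lambda p: p["timestamp"], reverse=True)

posts = [
    {"source": "local_news",  "timestamp": datetime(2021, 10, 25, 9, 0)},
    {"source": "random_page", "timestamp": datetime(2021, 10, 25, 10, 0)},  # not followed
    {"source": "best_friend", "timestamp": datetime(2021, 10, 24, 18, 30)},
]

for post in chronological_feed(posts, followed={"local_news", "best_friend"}):
    print(post["source"], post["timestamp"])
# local_news 2021-10-25 09:00:00
# best_friend 2021-10-24 18:30:00
```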
When will Facebook be renamed?
"They hid 50% more posts, indicating they weren’t thrilled with what they were seeing. They saw more Groups content, because Groups is one of the few places on Facebook that remains vibrant. And they saw double the amount of posts from public pages they don’t follow, often because friends commented on those pages."
Good? That's how it's supposed to work. You see a post you don't like, you indicate that you don't like it, and you stop seeing posts of that nature. You don't like a group's discourse, you leave that group. You don't want to be notified about someone commenting on someone else's page, you go to your notification settings and turn it off for that person. This gives users control over what they see and, with just a little bit of work, gives them the feed that best suits their preferences. The algorithm preempting your decision-making may be faster, but it isn't more accurate. Posts people want to see get hidden from them, and posts they want to avoid keep being surfaced, because the algorithm somehow decided "no, you really do want to see this, no matter how often you avoid engaging with it, I know better."
Re:
Here's the whole quote:
"Turning off the News Feed ranking algorithm, the researcher found, led to a worse experience almost across the board. People spent more time scrolling through the News Feed searching for interesting stuff, and saw more advertisements as they went (hence the revenue spike). They hid 50% more posts, indicating they weren't thrilled with what they were seeing. They saw more Groups content, because Groups is one of the few places on Facebook that remains vibrant. And they saw double the amount of posts from public pages they don't follow, often because friends commented on those pages."
This is clearly not an improvement.
"You see a post you don't like, you indicate that you don't like it, and you stop seeing posts of that nature."
That's how it works if there's an algorithm that is trying to learn your preferences. If not, then they will keep showing up because it's just a feed of everything.
"You don't like a group's discourse, you leave that group."
That is fine, but they're also getting this: "double the amount of posts from public pages they don't follow".
"This gives users control over what they see and, with just a little bit of work, gives them the feed that best suits their preferences."
How did you come to that conclusion? This experiment shows people having a lower percentage of content they want to see in their feed when the algorithm is turned off.
"The algorithm preempting your decision-making may be faster, but it isn't more accurate."
How do you know?
"Posts people want to see get hidden from them, and posts they want to avoid keep being surfaced..."
That is what happened with no algorithm. What makes you think it is even worse with the algorithm?
New + Complicated = Scary x 2
That is why "the algorithm" is such a bogeyman. It has become quite the shibboleth for computer illiteracy. Unlike FizzBuzz, you don't even need to administer it as a coding test to tell that somebody can't code their way out of a paper bag: if they say stupid shit about "the algorithm", then you know it doesn't matter if they have 11 PhDs in Computer Science; they're incompetent.
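(For reference, here's the classic FizzBuzz screening exercise the comment alludes to; its triviality is the commenter's point.)

```python
# The canonical FizzBuzz interview exercise: print 1..100, substituting
# "Fizz" for multiples of 3, "Buzz" for multiples of 5, and "FizzBuzz" for both.
for n in range(1, 101):
    if n % 15 == 0:
        print("FizzBuzz")
    elif n % 3 == 0:
        print("Fizz")
    elif n % 5 == 0:
        print("Buzz")
    else:
        print(n)
```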