Facebook's Ability To Influence The Election
from the who-needs-lobbying? dept
Earlier this year, there was a lot of hype and uproar over the revelation that, back in 2012, Facebook had run an experiment on news feeds to see if it could make people happy or sad. While I really don't think that experiment was so crazy, others disagreed. Of course, it was hardly the only experiment Facebook has run on its users, and over at Mother Jones, Micah Sifry last week revealed the details of another Facebook news feed experiment from 2012, one that influenced whether and how people voted:

For one such experiment, conducted in the three months prior to Election Day in 2012, Facebook increased the amount of hard news stories at the top of the feeds of 1.9 million users. According to one Facebook data scientist, that change—which users were not alerted to—measurably increased civic engagement and voter turnout.

As the article notes, Facebook had experimented with "I'm Voting" and "I'm a Voter" buttons on its site to see if they would encourage friends to vote, but its civic engagement tactics have gone much further than that. Still, even all the way back in 2010, Facebook had realized that just using those "voter" buttons likely increased voting:
After the election, the study's authors examined voter records and concluded that Facebook's nudging had increased voter turnout by at least 340,000. As the study noted, that's about 0.14 percent of the total voting-age population in 2010. Considering that overall turnout rose from 37.2 percent in 2006 to 37.8 percent in 2010—both off-year, nonpresidential elections—the Facebook scientists maintained that the voter megaphone impact in 2010 was substantial. "It is possible," the Facebook team wrote in Nature, "that more of the 0.6 percent growth in turnout between 2006 and 2010 might have been caused by a single message on Facebook."

Now, on to the 2012 experiment, which Facebook doesn't seem to want to talk about very much (and, in fact, it pulled a video about it after Sifry started poking around and asking questions):
In the fall of 2012, according to two public talks given by Facebook data scientist Lada Adamic, a colleague at the company, Solomon Messing, experimented on the news feeds of 1.9 million random users. According to Adamic, Messing "tweaked" the feeds of those users so that "instead of seeing your regular news feed, if any of your friends had shared a news story, [Messing] would boost that news story so that it was up top [on your page] and you were much more likely to see it." Normally, most users will see something more personal at the top of the page, like a wedding announcement or baby pictures.

There were also other experiments to see which types of messages (i.e., "I'm a Voter" vs. "I'm Voting") were more effective.
Messing's "tweak" had an effect, most strongly among occasional Facebook users. After the election, he surveyed that group and found a statistically significant increase in how much attention users said they paid to government. And, as the below chart used by Adamic in a lecture last year suggests, turnout among that group rose from a self-reported 64 percent to more than 67 percent. This means Messing's unseen intervention boosted voter turnout by 3 percent. That's a major uptick (though based only on user self-reporting).
I'm sure these kinds of efforts will concern some people, and there is already talk of Facebook "manipulating the election," but to some extent that's silly. The same could be said of just about any political campaigning or get-out-the-vote effort. Could there be some concern that Facebook has disproportionate power, or that (as the article suggests) it really only helps one party (more Facebook users are Democrats)? Perhaps, but that's the nature of a (mostly) free and open society where we have democratic elections. Some percentage of the public votes, and lots of people are pushing either to get them to vote or to get them to vote in certain ways. Facebook being a part of that is worth noting and following, but it's not necessarily a problem or something to be concerned about.
Filed Under: elections, influence, manipulate, social media, voting
Companies: facebook