Facebook's Ability To Influence The Election

from the who-needs-lobbying? dept

Earlier this year, there was a lot of hype and uproar about the revelation that, back in 2012, Facebook had run an experiment on news feeds to see if it could make people happy or sad. While I really don't think the experiment was so crazy, others disagreed. Of course, that was hardly the only experiment that Facebook has run on its users, and over at Mother Jones, Micah Sifry last week revealed the details of another Facebook news feed experiment from 2012: one that influenced whether and how people voted:
For one such experiment, conducted in the three months prior to Election Day in 2012, Facebook increased the amount of hard news stories at the top of the feeds of 1.9 million users. According to one Facebook data scientist, that change—which users were not alerted to—measurably increased civic engagement and voter turnout.
As the article notes, Facebook had experimented with "I'm Voting" or "I'm a Voter" buttons on its site to see if that would encourage friends to vote, but its civic engagement tactics have gone much further than that. Still, even all the way back in 2010, Facebook had realized that just using those "voter" buttons likely increased voting:
After the election, the study's authors examined voter records and concluded that Facebook's nudging had increased voter turnout by at least 340,000. As the study noted, that's about 0.14 percent of the total voting-age population in 2010. Considering that overall turnout rose from 37.2 percent in 2006 to 37.8 percent in 2010—both off-year, nonpresidential elections—the Facebook scientists maintained that the voter megaphone impact in 2010 was substantial. "It is possible," the Facebook team wrote in Nature, "that more of the 0.6 percent growth in turnout between 2006 and 2010 might have been caused by a single message on Facebook."
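The quoted figures are easy to sanity-check: taking the 340,000 extra voters and the 0.14 percent share at face value implies a voting-age population of roughly 243 million, which is in line with the 2010 US figure. A rough check, using only the numbers quoted above:

```python
# Rough sanity check of the quoted 2010 figures (illustrative only; the
# 340,000 and 0.14 percent numbers come from the Nature study as quoted).
extra_voters = 340_000  # additional voters attributed to Facebook's nudge
share_of_vap = 0.0014   # 0.14 percent of the voting-age population

# Implied voting-age population, if both figures are taken at face value.
implied_vap = extra_voters / share_of_vap
print(f"Implied voting-age population: {implied_vap:,.0f}")

# The overall turnout change the study cites: 37.2% in 2006 -> 37.8% in 2010.
growth_points = 37.8 - 37.2
print(f"Turnout growth: {growth_points:.1f} percentage points")
```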
Now, for the 2012 experiment, which Facebook doesn't seem to want to talk about very much (and, in fact, it pulled a video about it after Sifry started poking around and asking questions):
In the fall of 2012, according to two public talks given by Facebook data scientist Lada Adamic, a colleague at the company, Solomon Messing, experimented on the news feeds of 1.9 million random users. According to Adamic, Messing "tweaked" the feeds of those users so that "instead of seeing your regular news feed, if any of your friends had shared a news story, [Messing] would boost that news story so that it was up top [on your page] and you were much more likely to see it." Normally, most users will see something more personal at the top of the page, like a wedding announcement or baby pictures.

Messing's "tweak" had an effect, most strongly among occasional Facebook users. After the election, he surveyed that group and found a statistically significant increase in how much attention users said they paid to government. And, as the below chart used by Adamic in a lecture last year suggests, turnout among that group rose from a self-reported 64 percent to more than 67 percent. This means Messing's unseen intervention boosted voter turnout by 3 percent. That's a major uptick (though based only on user self-reporting).
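One arithmetical note on the quoted passage: a rise from 64 percent to 67 percent is 3 percentage points, not 3 percent; as a relative increase, it is closer to 4.7 percent. A quick illustration of the distinction:

```python
# Percentage points vs. relative percent, using the turnout figures quoted above.
before, after = 0.64, 0.67  # self-reported turnout without/with the tweak

point_change = (after - before) * 100               # percentage points
relative_change = (after - before) / before * 100   # relative increase

print(f"{point_change:.1f} percentage points")
print(f"{relative_change:.1f}% relative increase")
```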
There were also other experiments to see which types of messages (e.g., "I'm a Voter" vs. "I'm Voting") were more effective.

I'm sure that these kinds of efforts will concern some people, and some are already talking about Facebook "manipulating the election," but to some extent that's silly. The same could be said of just about any political campaigning or get-out-the-vote effort. Could there be some concern that Facebook has disproportionate power, or that (as the article suggests) it really only helps one party (more Facebook users are Democrats)? Perhaps, but that's the nature of a (mostly) free and open society with democratic elections. Some percentage of the public votes, and lots of people are pushing either to get them to vote or to get them to vote in certain ways. Facebook being a part of that is interesting to note and follow, but it's not necessarily a problem or something to be concerned about.

Thank you for reading this Techdirt post. With so many things competing for everyone’s attention these days, we really appreciate you giving us your time. We work hard every day to put quality content out there for our community.

Techdirt is one of the few remaining truly independent media outlets. We do not have a giant corporation behind us, and we rely heavily on our community to support us, in an age when advertisers are increasingly uninterested in sponsoring small, independent sites — especially a site like ours that is unwilling to pull punches in its reporting and analysis.

While other websites have resorted to paywalls, registration requirements, and increasingly annoying/intrusive advertising, we have always kept Techdirt open and available to anyone. But in order to continue doing so, we need your support. We offer a variety of ways for our readers to support us, from direct donations to special subscriptions and cool merchandise — and every little bit helps. Thank you.

–The Techdirt Team

Filed Under: elections, influence, manipulate, social media, voting
Companies: facebook


Reader Comments




  1. Ninja (profile), 4 Nov 2014 @ 3:36am

    I find it somewhat disturbing, considering there's a good portion of the population that is highly susceptible to manipulation (and, let's be honest, we are all going to be manipulated to some degree, even if it's not Facebook doing it). Still, what can be done about it? Even if you forbid Facebook from doing this, what would prevent it from manipulating things secretly?


  2. Anonymous Coward, 4 Nov 2014 @ 4:18am

    While the manipulation is not particularly concerning, what is concerning is that people in charge of companies are deeply manipulative.


  3. ponk head, 4 Nov 2014 @ 4:31am

    meaningless drivel

    Sorry, but those figures are at best statistical noise, and trying to claim one-off small percentage boosts based on Facebook feeds is rubbish.

    The real story here is: why the heck is Facebook messing with posts I want to read and replacing them with crap that no one ever reads?

    No wonder Facebook is becoming more and more irrelevant to everybody; they are hastening their own demise.


  4. Anonymous Coward, 4 Nov 2014 @ 5:01am

    I don't see how this isn't concerning

    Facebook could do what it did with the happy/sad manipulation: try to see if it could push people toward a different leaning.

    Imagine if Facebook started filtering the news feed to ignore certain candidates, or negative coverage of others. It could control which opinions reach the voters, just as the traditional news media has for generations. Except now everyone would think their friends are overwhelmingly endorsing a candidate.


  5. Anonymous Coward, 4 Nov 2014 @ 5:06am

    You could mess with ButtBook by creating accounts that are always happy or always sad no matter wtf they try to do.

    Do it for science


  6. lemrechaun (profile), 4 Nov 2014 @ 5:30am

    This isn't really a concern

    This is more commonly known as marketing research. You create test groups, provide different samples to each, and measure the responses. It's simpler for Facebook to do because we have voluntarily submitted our contact information to them; they have a direct feed to our eyes. Most companies use email, postal mail, surveys, and web-traffic monitoring.


  7. Anonymous Coward, 4 Nov 2014 @ 5:48am

    Re:

    Wait wut?

    Do you not 'even' human history?

    The values of liberty are all but gone in America. People would rather have a nanny state that fondles their breasts and balls before a flight, tells them what to eat, and dictates how their health care is going to go. We do NOT care how corrupt our politicians are, so long as those corrupt bastards give us what we want. We do not care that jury duty is our most important vote on how this nation runs, nationally or locally, instead shirking that responsibility and allowing the court system to rape the liberty of its citizens.

    Too many of us just do not care, and those that do care fight against wave after wave of ignorance and rhetoric that cares not for truth, but for its own special brand of dogma!


  8. Anonymous Coward, 4 Nov 2014 @ 5:49am

    Re: This isn't really a concern

    Experiments on animals are also "research"... that doesn't make them "ok".


  9. Anonymous Coward, 4 Nov 2014 @ 7:49am

    Re:

    While the manipulation is not particularly concerning, what is concerning is that people in charge of countries are deeply manipulative.


    FTFY


  10. Anonymous Coward, 4 Nov 2014 @ 9:04am

    How dare people criticize Facebook. It is such a great company...

    I don't know why I wrote that above but I just felt like it after looking at my Facebook feed.


  11. Anonymous Coward, 4 Nov 2014 @ 7:41pm

    Re: "They trust me — dumb fucks"

    It's not surprising that Facebook was founded by someone who openly bragged about how gullible and easily manipulated people were.

    "They trust me — dumb fucks."


