Facebook Messed With The Emotions Of 689,003 Users... For Science
from the and-maximum-creepiness dept
As you may have heard (since it appears to have become the hyped-up internet story of the weekend), the Proceedings of the National Academy of Sciences (PNAS) recently published a study done by Facebook, with an assist from researchers at UCSF and Cornell, in which they directly tried (and apparently succeeded in) manipulating the emotions of 689,003 Facebook users for a week. The participants -- without realizing they were part of the study -- had their news feeds "manipulated" so that they showed all good news or all bad news. The idea was to see if this made the users themselves feel good or bad. Contradicting some other research, which found that looking at photos of your happy friends made you sad, this research apparently found that happy stuff in your feed makes you happy. But what's got a lot of people up in arms is the other side of that coin: seeing a lot of negative stories in your feed appears to make people mad.
There are, of course, many different ways to view this, and the immediate response from many is "damn, that's creepy." Even the editor of the study admits, to The Atlantic, that she found it questionable:
"I was concerned," she told me in a phone interview, "until I queried the authors and they said their local institutional review board had approved it—and apparently on the grounds that Facebook apparently manipulates people's News Feeds all the time... I understand why people have concerns. I think their beef is with Facebook, really, not the research."Law professor James Grimmelmann digs deeper into both the ethics and legality of the study and finds that there's a pretty good chance the study broke the law, beyond breaking standard research ethics practices. Many people have pointed out, as the editor above did, that because Facebook manipulates its news feed all the time, this was considered acceptable and didn't require any new consent (and Facebook's terms of service say that they may use your data for research). However, Grimmelmann isn't buying it. He points to the official government policy on research on human subjects, which has specific requirements, many of which were not met.
While those rules apply to universities and federally funded research, many people assumed that they don't apply to Facebook as a private company. Except... this research involved two universities... and it was federally funded (in part) [Update: Cornell has updated its original story that claimed federal funding to now say the study did not receive outside funding.]. The rest of Grimmelmann's rant is worth reading as well, as he lays out in great detail why he thinks this is wrong.
While I do find the whole thing creepy, and think that Facebook probably could have and should have gotten more informed consent for this, a big part of this is still blurry. The lines aren't as clear as some people are making them out to be. People are correct to note that Facebook changes its newsfeed all the time, and of course Facebook is constantly tracking how that impacts things. So there's always some "manipulation" going on -- though usually it's to try to drive greater adoption, usage and (of course) profits. Is it really that different when it's done just to track emotional well-being?
As Chris Dixon notes, basic A/B testing is common for lots of sites, and he's unclear on how this is all that different. Of course, many people pointed out that manipulating someone's emotions to make them feel bad is (or at least feels) different, leading him to point out that plenty of entertainment offerings (movies, video games, music) manipulate our emotions as well -- though Dixon's colleague Benedict Evans points out that there's a sort of informed consent when you "choose" to go see a sad movie. Though, of course, a possible counter is that there are plenty of situations in which emotions are manipulated without such consent (think: advertising). In the end, this may just come down to what people expect.
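A quick aside for readers who haven't run one: here's a minimal sketch, in Python, of how a site might bucket users into test and control groups for an A/B test. Deterministic hashing is simply a common approach; the function and experiment names here are hypothetical illustrations, not anything Facebook has described.

    import hashlib

    def assign_variant(user_id: str, experiment: str,
                       variants=("control", "treatment")) -> str:
        # Hash the (experiment, user) pair so the same user always lands
        # in the same bucket, with a roughly uniform split across variants.
        digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
        return variants[int(digest, 16) % len(variants)]

    # The assignment is stable across sessions, with no per-user state stored.
    print(assign_variant("user-42", "feed-ranking-v2"))

The point of the deterministic hash is that a user's experience stays consistent for the life of the experiment, which is part of why this kind of testing is invisible to the people in it.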
If anything, what I think this really does is highlight how much Facebook manipulates the newsfeed. This is something very few people seem to think about or consider. Facebook's newsfeed system has always been something of a black box (which is a reason I prefer Twitter's setup, where you get the self-chosen firehose rather than some algorithm (or researchers' decisions) picking what you get to see). And thus, in the end, while Facebook may have failed to get the level of "informed consent" necessary for such a study, it may, in turn, have done a much better job of accidentally "informing" a lot more people about how its newsfeed gets manipulated. Whether or not that leads more people to rely on Facebook less, well, perhaps that will be the subject of a future study...
Filed Under: consent, emotions, ethics, informed consent, newsfeed
Companies: facebook
Reader Comments
I wonder: what if some of the people used as guinea pigs chose to make nothing public (i.e., only visible to friends on their lists)? It would mean that Facebook had to actively invade their personal space, since the researchers wouldn't otherwise be able to see those people's messages and determine whether they were happier or madder. Indeed creepy.
Control people?
Or how about society's beliefs in general, such as religion, or how to treat your neighbor, or whether you should confront the local police, etc...
One more reason to never use social media. Glad that train went right past me.
The lesson?
I wonder how many people complaining about this will continue to feed their information into the site.
Dan Gillmor tweeted "If you're complaining about Facebook's manipulation, remember that you probably helped make it possible by feeding your life into FB." and I agree.
Whatever!
There is a reason that dictators stay in power: the cowardice and gullibility of the people.
All that is necessary for Evil to prevail... and when the good guys do nothing in the face of Evil, they might as well just be Evil themselves.
If you save, on humanitarian grounds, the lives of those who would enslave and murder... then are you approving of their enslaving and slaughtering? They stared the slaves and the slaughtered right in the eyes as they did these things to them... what of these 'tyrants' is left to save?
...
Facebook is the greatest company in the world. They would never do anything to hurt their users and we should all commend them for allowing this kind of research to be done. It improves humanity and those of you that may have been impacted should feel lucky to have been involved.
Been a guinea pig before
On the one hand, I was mildly annoyed that I was part of a study without my consent, but I recognised that if I had been aware of the study I would have behaved differently, which would rather ruin the whole thing. In the end I decided I was OK with it, and actually quite liked the way it made use of information gathered from the game mechanics.
The sheer size of online games and social networks is an enormous boon for academics. It gives them a level of scale they've never really had access to before. Yes, they don't have specific consent, but, as noted above, if you know about it then your behaviour will be different.
For me, I think I'm willing to accept a degree of manipulation in the name of research. It then becomes more a question of what degree is acceptable, and ultimately the use to which the research will be put.
An AC noted the voting angle above. If the newsfeed can be manipulated based on emotions, it surely can be based on politics or lobbying. I can well imagine RIAA or MPAA types seeing this and wondering if it is the answer to their prayers for control... all they have to do is get Facebook on their side.
So...
Take a look at CNN's front page. Compare the number of positive stories to negative ones. Facebook shouldn't really be blamed for something our media has done to us for decades...
this is unreasonable
Everyone should drop Facebook as this is the sign of a nasty evil company that should die.
Facebook Messed With The Emotions Of 689,003 Users... For Science
GLaDOS
wow
hmmm
Re: Been a guinea pig before
damn.
The study claims that Facebook's TOS is informed consent. That's nonsense. A generic line buried in the TOS stating that research may be conducted is NOT informed consent. When people see that line, if they notice it at all, they imagine a passive study of how often you click what.
And Facebook? Your usefulness is one thing: letting people see their friends' posts. If you're arbitrarily filtering those posts, then why should anyone continue to use your site?
Re:
Isn't FB science grand?!
Re: Re: Re: hmmm
In theory: Tested behavior - Control group behavior = Effects of the test.
Just another internet link.
http://en.wikipedia.org/wiki/Tavistock_Institute
c:\peace.exe
Re: Been a guinea pig before
Not me. This is yet another reason to minimize any reliance on third parties.
With reporters, journalists, and newspapers so focused on reporting bad news (i.e., "if it bleeds, it leads"), what effect is that focus going to have on society?
Re: Re: Re: Re: hmmm
(TestedBehavior + EffectOfControlOnTestGroup) - EffectOfControlOnControlGroup = ResultsOfStudy
It's almost the same as what you said, but not quite.
But often it's the best we can do.
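In code, that comes down to a difference-in-means estimate. A toy sketch in Python, with invented numbers rather than anything from the study:

    # Toy difference-in-means estimate of an experimental effect.
    # All numbers are invented for illustration.
    test_group    = [0.62, 0.55, 0.71, 0.49, 0.66]  # e.g., positivity scores
    control_group = [0.58, 0.52, 0.61, 0.47, 0.60]

    def mean(xs):
        return sum(xs) / len(xs)

    # Subtracting the control mean cancels (in expectation) whatever acts
    # on both groups equally -- the "EffectOfControl" terms above.
    effect = mean(test_group) - mean(control_group)
    print(f"estimated effect: {effect:+.3f}")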
Even Techdirt has its bias, although they try (much) harder to "play fair" than anyone else I've seen - which is why I do follow Techdirt! (Keep up the good work, guys.)
As to the research - it was really rather pointless; politicians and advertisers have known about this - and used it to their advantage - forever...
Re: Re:
We should be careful before screaming about very minor marginal effects on large numbers of people - there are a million variables that affect us every day, and fiddling with just one is unlikely to cause major changes to any individual (as opposed to tiny changes across a population, visible only with statistical analysis).
Suppose I run a little personal "experiment" - one day I smile at everyone I meet; the next day I frown. I note the reactions.
Have I done something horrible? I think not.
Re:
I think you misunderstand... the "news" feed on Facebook isn't news; it's posts from your friends. So Facebook was withholding some of your friends' posts based on which group you were in and which words those posts contained.
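As an aside, the mechanism being described is word-list filtering; the study reportedly classified posts using LIWC word lists. A toy Python sketch of that kind of filter, with invented word lists and an invented omission probability:

    import random

    # Tiny placeholder word list; the study reportedly used LIWC's
    # much larger positive/negative lists.
    NEGATIVE = {"sad", "angry", "terrible", "hate"}

    def filter_feed(posts, suppress=NEGATIVE, omit_prob=0.5):
        # Randomly withhold some posts that contain a suppressed word.
        kept = []
        for post in posts:
            if set(post.lower().split()) & suppress and random.random() < omit_prob:
                continue  # this post never reaches the user's feed
            kept.append(post)
        return kept

    feed = ["I love this weather", "terrible day at work", "lunch was fine"]
    print(filter_feed(feed))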
Facebook is strip-mining human society
But we need no more from Facebook than truth in labeling.
We need no rules, no punishments, no guidelines. We need nothing but the truth.
Facebook should lean in and tell its users what it does.
It should say: "We watch you every minute that you're here. We watch every detail of what you do, what you look at, who you're paying attention to, what kind of attention you're paying, what you do next, and how you feel about it based on what you search for. We have wired the web so that we watch all the pages that you touch that aren't ours, so that we know exactly what you're reading all the time, and we correlate that with your behavior here."
To every parent Facebook should say: "Your children spend hours every day with us. Every minute of those hours, we spy upon them more efficiently than you will ever be able to."
Only that, just the truth. That will be enough.
But the crowd that runs Facebook, that small bunch of rich and powerful people, will never lean in close enough to tell you the truth.
So I ought to mention that since the last time we were together, it was revealed that Mr. Zuckerberg has spent thirty million dollars - that he got from raping human society - on buying up all the houses around his own in Palo Alto. Because he needs more privacy.
-Eben Moglen, on the "privacy transaction"
Re: Re: Re:
But where does one draw the line? Suppose Facebook had selected a group of people and modified their messages with the specific experimental goal of trying to get those people to commit suicide?
Let's say it wasn't successful and no one actually did commit suicide: Would you say that, because the effects weren't major, therefore such an experiment was okay?
If you don't think so, then where does one draw the line?
A thought experiment
Suppose that, in cooperation with a university studying mood, someone seeds clouds to cause rain in area A one day, while it's sunny in area B.
Another time, they reverse it - rain in B, while sunny in A.
Then they measure something to gauge happiness/sadness, to find out if it correlates with the weather.
Would we be equally upset? Why or why not?
Re: Re: Re: Re:
You draw the line at a risk of ANY harm, unless that harm was specifically disclosed to the participants and they agreed to take the risk.
Ethics: Before and after
Suppose, without informed consent, you feed a thousand people a small dose of a poison to determine if the poison is safe. None of them get sick, none of them die. Since no "major harm" results, is the test therefore ethical? I think not.
Because ethics applies to your actions, not the result of your actions. The question isn't whether people got sick or people died. The question is: Was it ethical to give them the poison without informed consent?
Yes, Facebook routinely modifies messages, and is allowed to do so by its terms of service. It is entirely different to deliberately select 689,000 people, and deliberately experiment to see if selected modifications will help some or harm others. To me, this appears unethical, even if there were no major harms as a result. The results don't matter; Facebook's actions matter.
Re: Ethics: Before and after
It's a bit like people using Google but then working themselves into a moral outrage that Google mines the information they give it.
Re: Re: Re: Re:
But that just serves to illustrate that this is not a hard-and-fast thing, but something that requires judgement to decide how much is too much.
I think most people would agree that my smile/frown experiment is OK, yet that your suicide experiment is not OK.
In between is a grey area; what is acceptable becomes a matter of opinion and debate.
Personally I don't think FB went over the line, but I agree that reasonable people can differ over it.
Re: Re: Re: Re: Re: You draw the line at a risk of ANY harm
Surely there is SOME risk of harm from every interaction we have with other people every day - whether we smile or frown, are polite or impolite, hurried or relaxed.
I don't think zero-tolerance works here.
Re: Re: A thought experiment
But unless I misunderstood (which is of course possible), all they did was bias which of various legitimate postings by friends they chose to show.
It's not like they promised to show ALL of your friends' postings in the first place - they've been selective all along. All they did was bias the selection criteria.
I really don't see the problem. But it does seem I'm the odd one out here.
Re: Re: Re: A thought experiment
This test changed the purpose of the selection criteria from trying to show you what interests you the most to something completely unrelated to your needs or interests.
Really, I consider this a bit of a tempest in a teapot -- I think Facebook routinely engages in greater sins, and people are foolish to trust them in the first place. But nonetheless, I think Facebook was wrong on this.
Re: Re: Re: Re: A thought experiment
But it's a very fine line - by that criterion, the same experiment would have been OK if the A/B test intent had been to find the best algorithm to make the users happier (surely that is directly related to users' interests).
[But I agree - FB does worse things than this that people don't complain much about.]
"They trust me ... Dumb fucks."
This has always been a mystery to me. Though apparently not to Mark Zuckerberg, who years ago sized up Facebook users rather accurately, in his famously leaked chat log:
"They "trust me"
"Dumb fucks."
It's basically the same conclusion P.T. Barnum came to over a century ago, as in his famous quote "There's a sucker born every minute." (and like P.T. Barnum, Mark Zuckerberg obviously has no qualms about turning that observation into a business model)
Re: "They trust me ... Dumb fucks."
"Dumb fucks."
First off, while that quote is inexcusable, I think it's silly that folks still point to it, as if a comment made by Mark Zuckerberg 10 years ago in his dorm room has any relevance to the multinational company that Facebook has become today.
It's basically the same conclusion P.T. Barnum came to over a century ago, as in his famous quote "There's a sucker born every minute."
Speaking of suckers... PT Barnum never said that.
http://www.historybuff.com/library/refbarnum.html
Re: Whatever!
--- Not everyone is like yourself.
"This is why all of those child day cares have "Share your Toys" reminders on the walls "
--- Well, there's incontrovertible evidence right there.
"Our Selfishness comes natural, too bad this shit does not go away with age"
--- Is this sarcasm or where you just reading the news on FB?
Evil doerzzz - oh my! Repent now.
Re: Whatever!
This is not true. Most people obey what they view as fair, sensible laws: not making waves, fitting in, treating others how they want to be treated. There has been much study on this, especially where sentencing guidelines are being determined.
An example of this is the steady ratcheting up of penalties for copyright infringement (piracy!). Many people do not view the extreme maximalist copyright position as reasonable, so even with the escalation in penalties, copyright infringement is increasing.
And for other laws, say murder and so on, that most people generally agree with, increasing penalties often has little to no effect, because most people just don't agree with committing murder and don't do it. The sort of people who do commit murder - who do believe it is a viable option - will commit the murder whether the penalty is 15 years, 30 years or even execution.
err, selfishness != evil. I think you are confusing self-interest, selfishness, with evil.
Selfishness is more along the lines of not being nice, kind, etc. Just because someone is a selfish b@st@rd doesn't make them evil.
Just because I don't want to share my toys doesn't mean I'm going to go and steal someone else's.
And the reminders to share toys and so on are reminders about being nice and kind to others. To make the environment more peaceful. To make it less stressful for the staff, so they don't have upset kids throwing temper tantrums because they can't get their favourite toy. They are not about good and evil.
Re: Re: Re: Re: Re: hmmm
What you brought up is the difference between theory and reality.
Re: Control people?
So... if the fact that it can have an influence on our opinions or emotions is a reason for us to stay away from social media, we should stop using all types of media! Is that the solution?
I think that it is easier and more practical to develop a critical mind; that it is important to look at every piece of information that is given to us and ask ourselves "Is this real?" or "Why was this information brought to me? Is someone trying to convince me of something?"
Re: Been a guinea pig before
That is, did they gather data from the game AS IS and use that, or did they specifically manipulate the game to test various theories?
Being part of a study where they don't manipulate the environment, just gather data from the environment, is different and less intrusive than being experimented on by being manipulated.
Also, personally, I feel there is a difference between playing a game that is supposed to manipulate you for your entertainment (e.g. questing for items, gaining experience to level up and become stronger, and earning money to again become 'better' in some way are all forms of manipulation by the game designers to encourage certain activities), and participating in 'real life' social interactions that are being deliberately manipulated by an uninvolved third party for research.
But then again, I suppose Facebook is about as real-life as Days of Our Lives...
Full Disclosure
My real research study was in fact:
If the communications between an institutional review board and university researchers was slightly modified ("You should not do that" was altered to read "You should do that"), would the massive violations of professional ethics (and subsequent loss of employment and reputation) affect the mood of the researchers?
Apparently the ToS were added after the study?
Population studies, where you unobtrusively observe a population for research, are one thing. Manipulating people or their experience without their knowledge or consent to see what happens is another. Guess which research method is the unethical one.