Facebook Messed With The Emotions Of 689,003 Users... For Science

from the and-maximum-creepiness dept

As you may have heard (since it appears to have become the hyped-up internet story of the weekend), the Proceedings of the National Academy of Sciences (PNAS) recently published a study done by Facebook, with an assist from researchers at UCSF and Cornell, in which they directly tried to manipulate (and apparently succeeded in manipulating) the emotions of 689,003 Facebook users for a week. The participants -- without realizing they were part of the study -- had their news feeds "manipulated" so that they showed all good news or all bad news. The idea was to see if this made the users themselves feel good or bad. Contradicting some other research, which found that looking at photos of your happy friends made you sad, this research apparently found that happy stuff in your feed makes you happy. But what's got a lot of people up in arms is the other side of that coin: seeing a lot of negative stories in your feed appears to make people mad.
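
To make the mechanics a little more concrete: the study scored posts as emotionally positive or negative by counting emotion words (reportedly via the LIWC word lists) and then held back posts of one valence from the feed. A minimal sketch of that idea in Python; the tiny word lists and the all-or-nothing filtering are invented stand-ins for the real, more subtle system:

    # Invented stand-in word lists; the real study used much larger ones.
    POSITIVE = {"happy", "great", "love", "wonderful"}
    NEGATIVE = {"sad", "awful", "hate", "terrible"}

    def classify(post):
        """Label a post by whichever emotion-word count dominates."""
        words = [w.strip(".,!?") for w in post.lower().split()]
        pos = sum(w in POSITIVE for w in words)
        neg = sum(w in NEGATIVE for w in words)
        return "positive" if pos > neg else "negative" if neg > pos else "neutral"

    def filtered_feed(posts, suppress):
        """Return a feed with posts of one emotional valence held back."""
        return [p for p in posts if classify(p) != suppress]

    feed = ["Had a great day, love this!", "Awful news, so sad.", "Lunch."]
    print(filtered_feed(feed, suppress="negative"))
    # ['Had a great day, love this!', 'Lunch.']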

There are, of course, many different ways to view this, and the immediate response from many is "damn, that's creepy." Even the editor of the study admits to The Atlantic that she found it questionable:
"I was concerned," she told me in a phone interview, "until I queried the authors and they said their local institutional review board had approved it—and apparently on the grounds that Facebook apparently manipulates people's News Feeds all the time... I understand why people have concerns. I think their beef is with Facebook, really, not the research."
Law professor James Grimmelmann digs deeper into both the ethics and the legality of the study and finds that there's a pretty good chance the study broke the law, beyond violating standard research ethics. Many people have pointed out, as the editor above did, that because Facebook manipulates its news feed all the time, this was considered acceptable and didn't require any new consent (and Facebook's terms of service say that it may use your data for research). However, Grimmelmann isn't buying it. He points to the official government policy on research on human subjects, which has specific requirements, many of which were not met.

While those rules apply to universities and federally funded research, many people assumed that they didn't apply to Facebook as a private company. Except... this research involved two universities... and it was federally funded (in part) [Update: Cornell has updated its original story that claimed federal funding to now say the study did not receive outside funding.]. The rest of Grimmelmann's rant is worth reading as well, as he lays out in great detail why he thinks this is wrong.

While I do find the whole thing creepy, and think that Facebook probably could have and should have gotten more informed consent about this, there is a big part of this that is still blurry. The lines aren't as clear as some people are making them out to be. People are correct in noting that Facebook changes its newsfeed all the time, and of course Facebook is constantly tracking how that impacts things. So there's always some "manipulation" going on -- though usually it's to drive greater adoption, usage and (of course) profits. Is it really that different when it's done just to track emotional well-being?

As Chris Dixon notes, basic A/B testing is common for lots of sites, and he's unclear how this is all that different. Of course, many people pointed out that manipulating someone's emotions to make them feel bad is (or at least feels) different, leading him to point out that plenty of entertainment offerings (movies, video games, music) manipulate our emotions as well -- though Dixon's colleague Benedict Evans points out that there's a sort of informed consent when you "choose" to go see a sad movie. Though, of course, a possible counter is that there are plenty of situations in which emotions are manipulated without such consent (think: advertising). In the end, this may just come down to what people expect.
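
For what it's worth, the kind of A/B testing Dixon has in mind is usually nothing more exotic than deterministically splitting users into buckets and comparing some metric between them. A quick sketch of the common hash-based pattern (a generic illustration, not Facebook's actual mechanism):

    import hashlib

    def assign_bucket(user_id, experiment, buckets=("control", "variant")):
        """Deterministically assign a user to a bucket: the same user and
        experiment always map to the same bucket, roughly evenly split."""
        digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
        return buckets[int(digest, 16) % len(buckets)]

    print(assign_bucket("user-42", "feed-ranking-v2"))  # 'control' or 'variant'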

If anything, what I think this really does is highlight how much Facebook manipulates the newsfeed. This is something very few people seem to think about or consider. Facebook's newsfeed system has always been something of a black box (which is a reason I prefer Twitter's setup, where you get the self-chosen firehose rather than some algorithm (or researchers' decisions) picking what you get to see). And, thus, in the end, while Facebook may have failed to get the level of "informed consent" necessary for such a study, it may have, in turn, done a much better job of accidentally "informing" a lot more people about how its newsfeed gets manipulated. Whether or not that leads more people to rely on Facebook less, well, perhaps that will be the subject of a future study...

Filed Under: consent, emotions, ethics, informed consent, newsfeed
Companies: facebook


Reader Comments



  • Anonymous Coward, 30 Jun 2014 @ 5:42am

    Under either circumstance, good or bad, the study proves that you cannot trust a single source for news.


    • Anonymous Coward, 30 Jun 2014 @ 6:22am

      Re:

      Indeed. It also highlights the high percentage of FB users who do just that.


  • Ninja (profile), 30 Jun 2014 @ 5:43am

    Cue lawsuits in 3, 2, 1...

    I wonder: what if some of the people used as guinea pigs chose to make nothing public (i.e. only visible to friends on their lists)? It would mean that Facebook would have to actively invade their personal space, since it wouldn't be able to see their messages and determine whether they were happier or madder without doing so. Indeed creepy.


  • Anonymous Coward, 30 Jun 2014 @ 5:54am

    Control people?

    So can we assume that they will now think they can control how people vote, based on how many negative/positive images/stories they show about one or the other political group?

    Or how about society's beliefs in general, such as religion, or how to treat your neighbor, or whether you should confront the local police, etc...

    One more reason to never use social media. Glad that train went right past me.


    • Anonymous Coward, 1 Jul 2014 @ 12:14am

      Re: Control people?

      Is showing images and telling stories that are either positive or negative unique to social media? No, it concerns all types of media, and this experiment could have been done with any of them, from movies to comic books. Here, the main advantage for researchers in using social media is that the process is much faster and the study can be done on a much larger sample.
      So... if the fact that it can influence our opinions or emotions is a reason for us to stay away from social media, we should stop using all types of media! Is that the solution?
      I think that it is easier and more practical to develop a critical mind; that it is important to look at every piece of information that is given to us and ask ourselves "Is this real?" or "Why was this information brought to me? Is someone trying to convince me of something?"


  • Zakida Paul (profile), 30 Jun 2014 @ 5:55am

    The lesson?

    Don't use Facebook.

    I wonder how many people complaining about this will continue to feed their information into the site.

    Dan Gillmor tweeted "If you're complaining about Facebook's manipulation, remember that you probably helped make it possible by feeding your life into FB." and I agree.


  • Anonymous Coward, 30 Jun 2014 @ 5:58am

    Whatever!

    This is nothing new; most people are default evil and are mindless sheep in need of a shepherd. This is why all of those child day cares have "Share your Toys" reminders on the walls and not "Be Greedy with your Toys" reminders. Our selfishness comes naturally; too bad this shit does not go away with age!

    There is a reason that dictators stay in power: the cowardice and gullibility of the people.

    All that is necessary for Evil to prevail... and when the good guys do nothing in the face of Evil, they might as well just be Evil themselves.

    If you save, on humanitarian grounds, the lives of those who would enslave and murder... then are you approving of their enslaving and slaughtering? They stared the slaves and the slaughtered right in the eyes as they did these things to them... what of these 'tyrants' is left to save?


    • Anonymous Coward, 30 Jun 2014 @ 6:57pm

      Re: Whatever!

      "most people are default evil and are mindless sheep in need of a Shepard."
      --- Not everyone is like yourself.


      "This is why all of those child day cares have "Share your Toys" reminders on the walls "
      --- Well, there's incontrovertible evidence right there.


      "Our Selfishness comes natural, too bad this shit does not go away with age"
      --- Is this sarcasm or where you just reading the news on FB?


      Evil doerzzz - oh my! Repent now.


    • Eldakka (profile), 30 Jun 2014 @ 11:56pm

      Re: Whatever!

      most people are default evil and are mindless sheep in need of a shepherd.

      This is not true. Most people obey what they view as fair, sensible laws: not making waves, fitting in, treating others how they want to be treated. There has been much study on this, especially where sentencing guidelines are being determined.

      An example of this is the steady ratcheting up of penalties for copyright infringement (piracy!). Many people do not view the extreme maximalist copyright position as reasonable. Therefore, even with the escalation in penalties, copyright infringement is increasing.

      And for other laws, say murder and so on, that most people generally agree with, increasing penalties often has little to no effect, because most people just don't agree with committing murder, so most people just don't do it. The sort of people who do commit murder, who do believe it is a viable option, will commit the murder whether the penalty is 15 years, 30 years or even execution.

      This is why all of those child day cares have "Share your Toys" reminders on the walls and not "Be Greedy with your Toys" reminders. Our selfishness comes naturally; too bad this shit does not go away with age!

      err, selfishness != evil. I think you are confusing self-interest, selfishness, with evil.

      Selfishness is more along the lines of not being nice, kind, etc. Just because someone is a selfish b@st@rd doesn't make them evil.

      Just because I don't want to share my toys doesn't mean I'm going to go and steal someone else's.

      And the reminders to share toys and so on are reminders about being nice and kind to others. To make the environment more peaceful. To make it less stressful for the staff so they don't have upset kids throwing temper tantrums because they can't get their favourite toy. They are not about good and evil.


  • Michael, 30 Jun 2014 @ 6:13am

    This is outrageous. Facebook should not be able to get away with experimenting on people this way. I'm going to look at my news feed to make sure it has not been included in this.

    ...

    Facebook is the greatest company in the world. They would never do anything to hurt their users and we should all commend them for allowing this kind of research to be done. It improves humanity and those of you that may have been impacted should feel lucky to have been involved.


  • Call me Al, 30 Jun 2014 @ 6:16am

    Been a guinee-pig before

    A relatively small online game I've been playing for most of a decade (www.pardus.at) was in part a psychology project for one of the creators of the game. I read through a synopsis of the paper produced and it was quite interesting to me.

    On the one hand I was mildly annoyed that I was part of a study without my consent, but I recognised that if I had been aware of the study I would have behaved differently, which would rather ruin the whole thing. In the end I decided I was OK with it, and actually quite liked the way it made use of information gathered from the game mechanics.

    The sheer size of online games and social networks is an enormous boon for academics. It gives them a level of scale they've never really had access to before. Yes, they don't have specific consent but, as noted above, if you know about it then your behaviour will be different.

    For me, I think I'm willing to accept a degree of manipulation in the name of research. It then becomes more a question of to what degree it is acceptable, and ultimately the use to which the research will be put.

    An AC above noted the voting angle. If the newsfeed can be manipulated based on emotions, it surely can be based on politics or lobbying. I can well imagine RIAA or MPAA types seeing this and wondering if this is the answer to their prayers for control... all they have to do is get Facebook on their side.


    • Call me Al, 30 Jun 2014 @ 7:12am

      Re: Been a guinee-pig before

      "guinea pig"

      damn.


    • John Fenderson (profile), 30 Jun 2014 @ 8:11am

      Re: Been a guinee-pig before

      "For me I think I'm willing to accept a degree of manipulation in the name of research"

      Not me. This is yet another reason to minimize any reliance on third parties.


    • Eldakka (profile), 1 Jul 2014 @ 12:27am

      Re: Been a guinee-pig before

      Was it a study or an experiment?

      That is, did they gather data from the game AS IS and use that, or did they specifically manipulate the game to test various theories?

      Being part of a study where they don't manipulate the environment, just gather data from the environment, is different and less intrusive than being experimented on by being manipulated.

      Also, personally, I feel there is a difference between playing a game that is supposed to manipulate you for your entertainment (e.g. questing for items, getting experience to level up and become stronger, and earning money to again become 'better' in some way are all forms of manipulation by the game designers to encourage certain activities), and participating in 'real life' social interactions that are being deliberately manipulated by an uninvolved 3rd party for research.

      But then again, I suppose Facebook is about as real life as Days of Our Lives...


  • tomczerniawski, 30 Jun 2014 @ 6:17am

    So...

    It seems reading positive stories makes people happy, while reading negative stories makes them angry.

    Take a look at CNN's front page. Compare the number of positive stories to negative ones. Facebook shouldn't really be blamed for something our media has done to us for decades...


  • ponk head, 30 Jun 2014 @ 6:18am

    this is unreasonable

    This is the first time I've heard that Facebook manipulates the user feeds, which is the opposite of why I joined. I want to see the items my friends post, as they are my friends and I want to be in on the posts as well. When I asked around, apparently it's common knowledge that they also don't allow all the messages from friends to appear on all the other friends' news feeds. OK, why? Apparently it's to make more room for advertising. So I think I will find another way to talk to my friends. I think this won't go too well for Facebook and its dirty secrets and nasty ways.

    Everyone should drop Facebook, as this is the sign of a nasty, evil company that should die.


  • Anonymous Howard (profile), 30 Jun 2014 @ 6:19am

    Facebook Messed With The Emotions Of 689,003 Users... For Science

    "you monster"

    GLaDOS


  • Anonymous Coward, 30 Jun 2014 @ 6:33am

    wow

    sad things make you sad, happy things make you happy... go figure!


  • Bengie, 30 Jun 2014 @ 6:33am

    hmmm

    My rational side says the best test subjects are unaware a test is in progress, but my personal warning system says "Danger! Slippery slope ahead!"


    • Anonymous Coward, 30 Jun 2014 @ 7:01am

      Re: hmmm

      Ever heard of double-blind trials? All participants agree, but half are part of the trial and the other half are the control group.


      • Call me Al, 30 Jun 2014 @ 7:03am

        Re: Re: hmmm

        But you are still aware that you are in a trial, and that has to have some influence on your behaviour.


        • Anonymous Coward, 30 Jun 2014 @ 7:38am

          Re: Re: Re: hmmm

          The playing field is leveled because all know they are part of an experiment, but only half are actual subjects, so any differences between the two groups are due to the experiment.


        • Anonymous Howard (profile), 30 Jun 2014 @ 7:45am

          Re: Re: Re: hmmm

          That's the point of the control group: to have a baseline behavior that is influenced by the knowledge of the test but uninfluenced by the actual test.

          In theory: Tested behavior - Control group behavior = Effects of the test.


          • OldMugwump (profile), 30 Jun 2014 @ 8:38am

            Re: Re: Re: Re: hmmm

            Almost.

            (TestedBehavior + EffectOfControlOnTestGroup) - EffectOfControlOnControlGroup = ResultsOfStudy

            It's almost the same as what you said, but not quite.

            But often it's the best we can do.


            • Anonymous Howard (profile), 1 Jul 2014 @ 12:06am

              Re: Re: Re: Re: Re: hmmm

              Indeed, that's why I wrote "In theory".
              What you brought up is the difference between theory and reality.
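
              To put invented numbers on the arithmetic in this thread: both groups know they are in a trial, so the awareness effect appears in both means and subtracts out, leaving an estimate of the tweak itself. A minimal sketch in Python (all figures hypothetical):

                  # Hypothetical happiness scores; both groups know they are in a trial,
                  # so that awareness effect is present in both means and cancels out.
                  control = [5.0, 5.2, 4.8, 5.0]      # saw the normal feed
                  treatment = [5.9, 6.1, 5.8, 6.2]    # saw the tweaked feed

                  mean = lambda xs: sum(xs) / len(xs)
                  effect = mean(treatment) - mean(control)
                  print(f"estimated effect of the tweak: {effect:+.2f}")  # +1.00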


  • That Anonymous Coward (profile), 30 Jun 2014 @ 6:56am

    But hey, now they know that if they put six positive things in a row, the drones feel better about that ad in the 7th position and will have a better feeling towards the brand. This means they can sell positive lead-in ads at a higher rate, to make people like your brand more.


  • Anonymous Coward, 30 Jun 2014 @ 7:26am

    I think basic ethics demand that you get informed consent before you start messing with people's emotions by filtering out positive messages from their friends.

    The study claims that Facebook's TOS is informed consent. That's nonsense. A generic line buried in the TOS stating that research may be conducted is NOT informed consent. When people see that line, if they notice it at all, they imagine a passive study of how often you click what.

    And Facebook? Your usefulness is one thing: letting people see their friends' posts. If you're arbitrarily filtering those posts, then why should anyone continue to use your site?


  • Anonymous Coward, 30 Jun 2014 @ 7:30am

    "Is it really that different when it's done just to track emotional well-being?"

    What if someone had been driven to suicide by those seven days of nonstop nightmare news? I think you'd be looking at that emotional manipulation a little differently then.


    • That Anonymous Coward (profile), 30 Jun 2014 @ 7:37am

      Re:

      We may never know. Someone who was on the edge and sent out messages seeking help might have been graded as sad and filtered from the streams of people who might have helped.

      Isn't FB science grand?!


      • OldMugwump (profile), 30 Jun 2014 @ 8:43am

        Re: Re:

        On the other hand, maybe somebody who was going to commit suicide got lots of positive messages and decided not to.

        We should be careful before screaming about very minor marginal effects on large numbers of people - there are a million variables that affect us every day, and fiddling with just one is unlikely to cause major changes to any individual (as opposed to tiny changes across a population, visible only with statistical analysis).

        Suppose I run a little personal "experiment" - one day I smile at everyone I meet; the next day I frown. I note the reactions.

        Have I done something horrible? I think not.


        • Coyne Tibbets (profile), 30 Jun 2014 @ 9:03am

          Re: Re: Re:

          Okay, maybe this was a minor effect. Maybe this was fiddling with just a few variables.

          But where does one draw the line? Suppose Facebook had selected a group of people and modified their messages with the specific experimental goal of trying to get those people to commit suicide?

          Let's say it wasn't successful and no one actually did commit suicide: Would you say that, because the effects weren't major, therefore such an experiment was okay?

          If you don't think so, then where does one draw the line?


          • Anonymous Coward, 30 Jun 2014 @ 9:33am

            Re: Re: Re: Re:

            where does one draw the line?


            You draw the line at a risk of ANY harm, unless that harm was specifically disclosed to the participants and they agreed to take the risk.


            • OldMugwump (profile), 30 Jun 2014 @ 11:37am

              Re: Re: Re: Re: Re: You draw the line at a risk of ANY harm

              Well, then, what about my smile/frown experiment?

              Surely there is SOME risk of harm from every interaction we have with other people every day - whether we smile or frown, are polite or impolite, hurried or relaxed.

              I don't think zero-tolerance works here.


          • OldMugwump (profile), 30 Jun 2014 @ 11:35am

            Re: Re: Re: Re:

            No, I'd agree that was "over the line".

            But that just serves to illustrate that this is not a hard-and-fast thing, but something that requires judgement to decide how much is too much.

            I think most people would agree that my smile/frown experiment is OK, yet that your suicide experiment is not OK.

            In between is a grey area; what is acceptable becomes a matter of opinion and debate.

            Personally I don't think FB went over the line, but I agree that reasonable people can differ over it.


    • Anonymous Coward, 30 Jun 2014 @ 8:06am

      Re:

      And it wasn't just to "track" it. Few would have a problem with this study if it was just tracking. They intentionally manipulated it.


  • Slave #74826/1581[class:4], 30 Jun 2014 @ 8:06am

    It's only wrong and disturbing because they got caught. As tech advances, our humanity will be more lost than ever. Cling onto the ones you love.

    Just another internet link.
    http://en.wikipedia.org/wiki/Tavistock_Institute

    c:\peace.exe


  • Zauber Paracelsus (profile), 30 Jun 2014 @ 8:22am

    Ignoring the creep-out factor and the ethics, this really raises one question:

    With reporters, journalists, and newspapers so focused on reporting bad news (i.e. "if it bleeds, it leads"), what effect is that focus going to have on society?


    • Anonymous Coward, 30 Jun 2014 @ 8:48am

      Re:

      It makes it easier for governments to increase their control over people's lives.


  • CanadianByChoice, 30 Jun 2014 @ 8:42am

    I really don't understand the apparent outrage over this. All "news" services manipulate their feeds; they only tell you what they (or their political affiliates) want you to hear/see. It's known as "slanting" and it's always been this way. This is why I stopped paying any attention to "news" decades ago.
    Even TechDirt has its bias, although they try (much) harder to "play fair" than anyone else I've seen - which is why I do follow TechDirt! (Keep up the good work, guys.)
    As to the research - it was really rather pointless; politicians and advertisers have known about this - and used it to their advantage - forever...


    • Anonymous Coward, 30 Jun 2014 @ 8:53am

      Re:

      All "news" services manipulate thier feeds


      I think you misunderstand... the "news" feed on Facebook isn't news, it's posts from your friends. So Facebook was not showing posts from your friends based on which group you were in and which words were contained.


  • Anonymous Coward, 30 Jun 2014 @ 8:56am

    Facebook is strip-mining human society

    "The idea of social sharing, in a context in which the service provider reads everything and watches everybody watch, is inherently unethical.

    But we need no more from Facebook than truth in labeling.

    We need no rules, no punishments, no guidelines. We need nothing but the truth.

    Facebook should lean in and tell its users what it does.

    It should say "We watch you every minute that you're here. We watch every detail of what you do, what you look at, who you're paying attention to, what kind of attention you're paying, what you do next, and how you feel about it based on what you search for.

    We have wired the web so that we watch all the pages that you touch that aren't ours, so that we know exactly what you're reading all the time, and we correlate that with your behavior here."

    To every parent Facebook should say, "Your children spend hours every day with us. Every minute of those hours, we spy upon them more efficiently than you will ever be able to."

    Only that, just the truth. That will be enough.

    But the crowd that runs Facebook, that small bunch of rich and powerful people, will never lean in close enough to tell you the truth.

    So I ought to mention that since the last time we were together, it was revealed that Mr. Zuckerberg has spent thirty million dollars that he got from raping human society on buying up all the houses around his own in Palo Alto.

    Because he needs more privacy."

    -Eben Moglen on the "privacy transaction".


  • OldMugwump (profile), 30 Jun 2014 @ 9:07am

    A thought experiment

    Suppose instead of Facebook it was the National Weather Service.

    In cooperation with a university studying mood, they seed clouds to cause rain in area A one day, while it's sunny in area B.

    Another time, they reverse it - rain in B, while it's sunny in A.

    Then they measure something to gauge happiness/sadness, to find out whether it correlates with the weather.

    Would we be equally upset? Why or why not?


    • John Fenderson (profile), 30 Jun 2014 @ 9:21am

      Re: A thought experiment

      Of course not. Your NWS example involves manipulating the weather. The Facebook action involves deception in regards to information about your friends. The latter is much more sensitive and personal.


      • OldMugwump (profile), 30 Jun 2014 @ 11:59am

        Re: Re: A thought experiment

        If there was actual deception involved, I'd agree with you - totally unacceptable.

        But unless I misunderstood (of course possible), all they did was bias which of various legitimate postings by friends they chose to show.

        It's not like they promised to show ALL the friends' postings in the first place - they've been selective all along. All they did was bias the selection criteria.

        I really don't see the problem. But it does seem I'm the odd one out here.


        • John Fenderson (profile), 30 Jun 2014 @ 12:32pm

          Re: Re: Re: A thought experiment

          The deception was not that they were being selective; it's that they deviated rather severely from their stated selection criteria. What they'd said about the selection criteria was that it was based on things intended to predict the postings that would be of most use to you: things like which posts you liked, which people you comment on the most, etc.

          This test changed the purpose of the selection criteria from trying to show you what interests you the most to something completely unrelated to your needs or interests.

          Really, I consider this a bit of a tempest in a teapot -- I think Facebook routinely engages in greater sins, and people are foolish to trust them in the first place. But nonetheless, I think Facebook was wrong on this.
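
          A toy illustration of the distinction drawn above (all fields and weights invented): ranking posts by predicted interest to the user versus ranking them to serve an experimental goal can produce opposite feeds from the same inputs. A minimal sketch in Python:

              def interest_score(post):
                  # Stated criteria: signals about what the user finds useful.
                  return 2.0 * post["likes_from_user"] + 1.5 * post["comments_from_user"]

              def experiment_score(post, suppress="negative"):
                  # Experimental criteria: push one emotional valence out of the feed.
                  return 0.0 if post["valence"] == suppress else 1.0

              posts = [
                  {"id": 1, "likes_from_user": 3, "comments_from_user": 1, "valence": "negative"},
                  {"id": 2, "likes_from_user": 0, "comments_from_user": 0, "valence": "positive"},
              ]

              print([p["id"] for p in sorted(posts, key=interest_score, reverse=True)])    # [1, 2]
              print([p["id"] for p in sorted(posts, key=experiment_score, reverse=True)])  # [2, 1]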


          • OldMugwump (profile), 30 Jun 2014 @ 1:01pm

            Re: Re: Re: Re: A thought experiment

            OK, I think you've convinced me. This was wrong because it was intended to produce science instead of benefiting the users.

            But it's a very fine line - by that criterion, the same experiment would have been OK if the A/B test intent had been to find the best algorithm to make the users happier (surely that is directly related to users' interests).

            [But I agree - FB does worse things than this that people don't complain much about.]


  • Coyne Tibbets (profile), 30 Jun 2014 @ 9:34am

    Ethics: Before and after

    I find myself amazed by the apologists. Paraphrased: "Facebook's terms of service allow modification of messages. No one was hurt. So they are in the clear."

    Suppose, without informed consent, you feed a thousand people a small dose of a poison to determine if the poison is safe. None of them get sick, none of them die. Since no "major harm" results, is the test therefore ethical? I think not.

    Because ethics applies to your actions, not the result of your actions. The question isn't whether people got sick or people died. The question is: Was it ethical to give them the poison without informed consent?

    Yes, Facebook routinely modifies messages, and is allowed to do so by its terms of service. It is entirely different to deliberately select 689,000 people, and deliberately experiment to see if selected modifications will help some or harm others. To me, this appears unethical, even if there were no major harms as a result. The results don't matter; Facebook's actions matter.


    • John Fenderson (profile), 30 Jun 2014 @ 10:05am

      Re: Ethics: Before and after

      I'm certainly not a Facebook apologist, but there is a certain measure of "what did you expect?" about all of this. Facebook has a strong track record of being untrustworthy with regards to how they handle your personal information. I do have a hard time seeing how anyone who continues to use Facebook has some kind of moral high ground here.

      It's a bit like people using Google but then working themselves into a moral outrage that Google mines the information they give them.


  • zip, 30 Jun 2014 @ 1:19pm

    "They trust me ... Dumb fucks."

    It seems that no matter how many times Facebook gets caught flagrantly breaking its promises or otherwise doing something underhanded or unethical, the vast majority of Facebook users will remain loyal.

    This has always been a mystery to me. Though apparently not to Mark Zuckerberg, who years ago sized up Facebook users rather accurately, in his famously leaked chat log:

    "They "trust me"
    "Dumb fucks."

    It's basically the same conclusion P.T. Barnum came to over a century ago, as in his famous quote "There's a sucker born every minute." (and like P.T. Barnum, Mark Zuckerberg obviously has no qualms about turning that observation into a business model)


    • Mike Masnick (profile), 30 Jun 2014 @ 4:50pm

      Re: "They trust me ... Dumb fucks."

      "They "trust me"
      "Dumb fucks."


      First off, while that quote is inexcusable, I think it's silly that folks still point to it, as if a comment made by Mark Zuckerberg 10 years ago while he was in his dorm room has any relevance to the multinational company that Facebook has become today. That's just silly.

      It's basically the same conclusion P.T. Barnum came to over a century ago, as in his famous quote "There's a sucker born every minute."

      Speaking of suckers... PT Barnum never said that.

      http://www.historybuff.com/library/refbarnum.html


  • Anonymous Coward, 1 Jul 2014 @ 2:02am

    Full Disclosure

    The mood experiment on FB was not the actual study, although I still must apologize for any distress caused to third parties.

    My real research study was in fact:

    If the communications between an institutional review board and university researchers were slightly modified ("You should not do that" altered to read "You should do that"), would the massive violations of professional ethics (and subsequent loss of employment and reputation) affect the mood of the researchers?


  • Zonker, 2 Jul 2014 @ 11:45am

    I wonder how many people got "unfriended" for too many negative posts on the feed while their positive posts were hidden. How many people were upset that everybody ignored the secretly hidden posts about their parents passing away, but everyone liked their cat pictures. How many people thought their friends were blocking them because they no longer saw any of their posts at all.

    Population studies where you unobtrusively observe a population are one thing. Manipulating people or their experience without their knowledge or consent to see what happens is another. Guess which research method is the unethical one.


