Law Professor Claims Any Internet Company 'Research' On Users Without Review Board Approval Is Illegal

from the you-sure-you-want-to-go-there dept

For many years I've been a huge fan of law professor James Grimmelmann. His legal analysis on various issues is often quite valuable, and I've quoted him more than a few times. However, he's now arguing that the infamous Facebook happiness experiment and the similarly controversial OkCupid "hook you up with someone you should hate" experiment weren't just unethical, but illegal. Grimmelmann, it should be noted, was one of the loudest voices arguing (quite vehemently) that these experiments were horrible and dangerous, and that the academic aspect of Facebook's research violated long-standing rules.

But his new argument takes it even further, claiming not just that they were unethical, but flat-out illegal, based on his reading of the Common Rule and a particular Maryland law that effectively extends it. The Common Rule basically says that if you're doing "research involving human subjects" with federal funds, you need "informed consent" from the subjects and approval from an institutional review board (IRB) -- a body that basically all research universities have in place to vet proposed studies. The idea is to head off seriously harmful or dangerous experiments. The Maryland law takes the Common Rule and says it applies not just to federally funded research but to "all research conducted in Maryland."

To Grimmelmann, this is damning for both companies -- and basically all companies doing any research involving people in Maryland. In fact, he almost gleefully posts a letter he got back from Facebook after he raised this issue and alerted the company to the Maryland law. Why so gleeful? Because Facebook's Associate General Counsel for Privacy, Edward Palmieri, repeatedly referred to what Facebook did as "research," leading Grimmelmann to play the "gotcha" card, as if that proves that Facebook's efforts were subject to the Maryland law (and thus to the Common Rule). He then overreacts to Palmieri's assertion (an accurate one, in our opinion) that the Maryland law does not apply to Facebook's research, treating it as Facebook declaring that the company "is above the law that applies to everyone else."

Except... all of that is suspect. Facebook is not claiming it is above the law that applies to everyone else. It claims that the law does not apply to it... or to basically any company doing research to improve its own services. Grimmelmann insists that his reading of Maryland's House Bill 917 is the only possible reading, but he may be hard-pressed to find many who actually agree with that interpretation. The Common Rule's definition of "research" is fairly broad, but I don't think it's nearly as broad as Grimmelmann wants it to be. Here it is:
Research means a systematic investigation, including research development, testing and evaluation, designed to develop or contribute to generalizable knowledge.
I think it's that last bit that may be problematic for Grimmelmann. It focuses on academic research "designed to develop or contribute to generalizable knowledge." That wording, while unfortunately vague, really appears to be aimed at those doing research for the purpose of creating publicly available knowledge. And while perhaps the Facebook effort touches on that, since it eventually became published research, it still seems like a stretch. Facebook wasn't doing its research for the purpose of contributing to generalizable knowledge -- it was trying to improve the Facebook experience. It just happened that, having done so, the company also shared some of that data publicly. Similarly, OkCupid's research was done to improve its own services.

But under Grimmelmann's interpretation of the law, you'd get some seriously crazy results. Basic A/B testing of different website designs could be designated illegal research without IRB approval or informed consent. I was just reading about a service that lets you put as many headlines on a blog post as you want and automatically rotates them, trying to optimize for whichever one gets the best results. Would that require informed consent and an IRB? Just the fact that companies call it "research" doesn't make it research under the Common Rule definition. How about a film studio running a survey after showing a movie? The movie manipulates the emotions of the "human subjects," and the studio then does research on their reactions. Does that require "informed consent" and an IRB?
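
To see just how mundane this sort of thing is, here's a rough sketch of that headline-rotation idea in Python. This is purely illustrative -- the headlines and function names are made up, and it's not any real service's code -- but it is, in essence, the whole "experiment": show each visitor a randomly chosen headline, count impressions and clicks, and keep whichever headline performs best.

import random
from collections import defaultdict

headlines = [
    "Ten Ways Your Website Is Breaking Maryland Law",
    "Is Every A/B Test Now Illegal?",
    "The Common Rule Comes For The Internet",
]

impressions = defaultdict(int)  # how many times each headline was shown
clicks = defaultdict(int)       # how many times a shown headline was clicked

def pick_headline():
    """Pick a headline at random for the next visitor."""
    choice = random.choice(headlines)
    impressions[choice] += 1
    return choice

def record_click(headline):
    """Record that a visitor clicked through on the headline they were shown."""
    clicks[headline] += 1

def best_headline():
    """Return the headline with the best observed click-through rate so far."""
    return max(headlines, key=lambda h: clicks[h] / impressions[h] if impressions[h] else 0.0)

Countless sites run exactly this kind of test every day without anyone mistaking it for human subjects research.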

How about a basic taste test -- Coke or Pepsi? Which do you prefer? It's research. It's developing knowledge via "human subjects." But does anyone honestly think the law means that any company setting up such a taste test first needs to get an IRB to approve it? The results of Grimmelmann's interpretation of the law are nonsensical. Grimmelmann is clearly upset about the original research, and certainly there were lots of people who felt it was creepy and potentially inappropriate. But Grimmelmann's focus on actively punishing these companies is reaching obsession levels.
For one thing, many academic journals require Common Rule compliance for everything they publish, regardless of funding source. So my colleague Leslie Meltzer Henry and I wrote a letter to the journal that published the Facebook emotional manipulation study, pointing out the obvious noncompliance. For another, nothing in Facebook’s user agreement warned users they were signing up to be test subjects. So we wrote a second letter to the Federal Trade Commission, which tends to get upset when companies’ privacy policies misrepresent things. And for yet another, researchers from universities that do take federal funding can’t just escape their own Common Rule obligations by “IRB laundering” everything through a private company. So we wrote a third letter to the federal research ethics office about the Cornell IRB’s questionable review of two Cornell researchers’ collaborations with Facebook.
And that's before the letters to Facebook and OkCupid -- and, of course, to Maryland's attorney general, Doug Gansler. If Gansler actually tried to enforce such an interpretation of the law (which is not out of the question, given how quick many attorneys general are to jump on grandstanding issues that will get headlines), it would represent a very dangerous result -- one in which very basic forms of experimentation and modification in all sorts of industries (beyond just the internet) would suddenly create a risk of law-breaking. That's a result incompatible with basic common sense. Grimmelmann's response to that seems to be "but the law is the law," but that's based entirely on his stretched interpretation of that law, one that many others would likely challenge.


Filed Under: common rule, experiments, human subjects, informed consent, irb, james grimmelmann, maryland, research
Companies: facebook, okcupid


Reader Comments



    Sneeje (profile), 24 Sep 2014 @ 10:16am

    First we kill...

    the lemonade stands, and next on the block is the science fair...

      John William Nelson (profile), 24 Sep 2014 @ 11:12am

      Re: First we kill...

      You joke about this, but I once won a school science fair and could not go to the district fair because my experiment did not have IRB approval, even though I had consent forms. No lie, lol.

    Applesauce, 24 Sep 2014 @ 10:22am

    So, just set up a review board.

    All Facebook needs to do is set up their own institutional review board to rubber stamp their research plans. I know from direct personal experience that the Univ. of Maryland has done exactly that. Problem solved.

    pegr, 24 Sep 2014 @ 10:42am

    Disregard illegal laws

    There are obvious First Amendment issues here.

      Sneeje (profile), 24 Sep 2014 @ 10:55am

      Re: Disregard illegal laws

      Ummmm... you may not understand what the boundaries or intent of the first amendment are. The crux of this issue is that the research or experiments are performed on and involve others without their express consent.

      No researcher can claim protection under the first amendment for experiments that have an impact on others. Your first amendment rights end when they impinge on others' natural or constitutional rights to health and welfare.

      At the extreme, a "Dr. Mengele" cannot claim that the horrific experiments he/she performed on others without (or with) their consent are protected by the first amendment.

        Anonymous Coward, 24 Sep 2014 @ 11:15am

        Re: Re: Disregard illegal laws

        I would expect that FB would simply claim that they can do whatever they want with how they display feeds to users.

        Besides, using his logic, all advertising would be illegal since it's used for research into purchasing habits, and is often displayed in such a way that informed consent isn't present.

        pegr, 24 Sep 2014 @ 11:19am

        Re: Re: Disregard illegal laws

        Sure, your right to swing your arm ends at my nose. But the "experiments" were only expressions protected by the 1st. If I stand on a street corner and say "sniggleblert" to everyone passing and make note of their reaction, I don't need their permission.

        Just like right now. I don't need your permission to reply to your post. Far cry from Dr. Mengele! (Borderline Godwin, BTW.)

          Sneeje (profile), 24 Sep 2014 @ 5:05pm

          Re: Re: Re: Disregard illegal laws

          For the record, you said Godwin first.

          Sneeje (profile), 24 Sep 2014 @ 5:18pm

          Re: Re: Re: Disregard illegal laws

          And your exact example shows what you may be missing. The nuance here is not as stark as you describe. Observations that "end at people's noses" in public may be one thing (and still arguable), but observations of behavior around which people have an expectation of privacy are entirely different.

          You do not have unrestricted free speech rights on my physical private property, for example. In fact, I can engage in prior restraint should I so choose.

          That is the nuance being discussed here, where digital boundaries are exchanged for physical ones.

    Anonymous Coward, 24 Sep 2014 @ 10:47am

    Facebook wasn't doing its research for the purpose of contributing to generalizable knowledge -- but to improve the Facebook experience.


    We'll need proof of that. A lawyer cannot be trusted to answer truthfully where his (very rich) client is concerned.

      W Klink (profile), 24 Sep 2014 @ 12:25pm

      Re:

      "Facebook wasn't doing its research for the purpose of contributing to generalizable knowledge..."

      Facebook collaborated with Cornell University and published a journal article in the Proceedings of the National Academy of Sciences.

        orbitalinsertion (profile), 24 Sep 2014 @ 12:53pm

        Re: Re:

        This is exactly where the problem is. We went from a mention of federally funded academic research to immediately ignoring it thereafter. Those rules are there to help keep academic research honest and safe.

        If these rules seem too much for the type of research, get the rules altered appropriately. It doesn't affect what private companies do otherwise.

    John William Nelson (profile), 24 Sep 2014 @ 10:59am

    No more A/B testing without IRB approval!

    Definitely disappointed in Grimmelmann's analysis here. It's rather weak and lacks perspective.

    Taking his view, simple A/B version testing of a website or landing page would violate the law unless you had an IRB approve it.

      Anonymous Coward, 24 Sep 2014 @ 11:27am

      Re: No more A/B testing without IRB approval!

      Easy solution: block all network traffic in/out of Maryland.

        Anonymous Coward, 25 Sep 2014 @ 4:36am

        Re: Re: No more A/B testing without IRB approval!

        And France, too. Don't forget France. Nutters.

      W Klink (profile), 24 Sep 2014 @ 12:27pm

      Re: No more A/B testing without IRB approval!

      ...if you intend to publish a research paper and use federal money (which Facebook did).

        Mike Masnick (profile), 24 Sep 2014 @ 12:54pm

        Re: Re: No more A/B testing without IRB approval!

        ...if you intend to publish a research paper and use federal money (which Facebook did).


        Did they use federal money? There were original claims that Cornell did, but Cornell later said it was a mistake and they didn't use federal funds.

          John William Nelson (profile), 24 Sep 2014 @ 2:27pm

          Re: Re: Re: No more A/B testing without IRB approval!

          Or, based on Grimmelmann's reasoning, if you do A/B testing on folks in Maryland. Federal funding is not a requirement under the Maryland law Grimmelmann relies upon.

          Then again, you really have to look into the Code of Federal Regulations and its definitions—which are explicitly referenced in the Maryland law—to see what is covered and what is not. It's too late in a long day on an already long week for me to go trouncing through the CFR unless I'm getting paid for it, though.

    Mason Wheeler (profile), 24 Sep 2014 @ 11:15am

    I dunno. The parade of horribles mentioned here seems like a distraction. What OKCupid and Facebook did was nothing like UI testing or a taste test, for a few simple reasons: participants know that they're participating in research (especially in a taste test), and even in UI testing, where the "study subjects" may be unaware, the test is, as a general rule of thumb, not designed to be actively detrimental to the study subjects' well-being.

    But there's a very good chance that an average, reasonably well-informed layperson would look at both of these corporate "studies" and conclude that that is exactly what they are: actively designed to inflict harm and emotional distress and measure the results.

    The appropriate question to ask in a situation like this isn't "should this be considered legal under a particular interpretation of a particular state law or not?" It's "should people who do stuff like this be prosecuted for crimes against humanity or not?"

      John Fenderson (profile), 24 Sep 2014 @ 11:34am

      Re:

      "actively designed to inflict harm and emotional distress and measure the results."

      I disagree with that characterization of those tests, actually. Neither of them were "designed to inflict harm and emotional distress". Further, as near as I can tell, no harm was actually inflicted.

      For the record, I find both of those tests objectionable. But there's no need to overstate the situation.

        Mason Wheeler (profile), 24 Sep 2014 @ 12:05pm

        Re: Re:

        I don't see it as overstating the situation, not at all.

        In Facebook's case, a trusted news source (and yes, it was trusted by its users, whether or not it should have been) deliberately fed bad news to its lab ra... ahem, sorry, to its users to measure the negative emotional impact it would have on them. And they also did the opposite--trying to manipulate people with good news and distort their emotions in a positive direction and measure its effectiveness--which may appear less creepy on the surface but is possibly even worse; do you want powerful entities knowing how to pacify you by inducing happiness when something is going wrong that you should be agitated about? This is stuff straight out of a dystopian fiction novel, becoming real before our eyes.

        And OKCupid's experiment, setting people up with dates they "would probably hate" is, if anything, even more reprehensible. Relationships gone wrong are emotionally distressing pretty much by default, at the very least, and the harm done only goes up from there. The people who ran that experiment ought to be thrown in prison, and anybody with a happy sheltered life who's never experienced domestic violence has no right to tell me otherwise. Period.

        There are some lines that should never be crossed, and Facebook and OKCupid crossed a couple of them, and I sincerely do want to see some very heavy-handed criminal prosecution for it.

          John Fenderson (profile), 24 Sep 2014 @ 12:57pm

          Re: Re: Re:

          I feel the need to reiterate, since I'm in the uncomfortable position of actually defending Facebook and OKCupid here to some degree, that I agree with you that those tests were awful, and if I were a user of either site, I would be no longer. You and I agree more than we disagree here.

          "deliberately fed bad news to its lab ra... ahem, sorry, to its users to measure the negative emotional impact it would have on them"

          This is a little misleading and very loaded. Facebook was not feeding anything to users that weren't in their mix to begin with. What Facebook did was to slightly increase the odds that posts that contained certain keywords, a "positive" set and a "negative" set, would make it in their main stream. They weren't looking for "good news" and "bad news", but rather the tone of the post. What they wanted to measure was not the emotional impact of this, but whether the tone of the page overall made it more or less likely that people would interact with the site.

          "setting people up with dates they "would probably hate" is, if anything, even more reprehensible. Relationships gone wrong are emotionally distressing pretty much by default, at the very least, and the harm done only goes up from there."

          This is certainly overstating. First, OKCupid doesn't set anyone up on dates at all. It suggests people for you to talk to. You interact with them online and, if it seems you get along, you might agree to go on a date.

          When OKCupid intentionally mismatched people, it wasn't committing them to anything at all, let alone dates or relationships. The worst thing that would happen is you exchange a few emails and decide you don't like each other. If you're deceived by the time a date actually happens... well, that can't possibly be OKCupid's fault.

          "who's never experienced domestic violence has no right to tell me otherwise"

          What does domestic violence have to do with this? I genuinely don't see the connection.

            Mason Wheeler (profile), 24 Sep 2014 @ 1:35pm

            Re: Re: Re: Re:

            What does domestic violence have to do with this? I genuinely don't see the connection.

            Have you had a mostly happy life?

            It has to do with when relationships go wrong. It's pretty much always bad, but how bad it is really varies. At the light end, there's emotional distress. Somewhere in the middle you get stuff like stalking, rape, domestic violence and murder. But the really bad effects are far worse: when you mix an abusive relationship and families with children, you get cycles of abuse and domestic problems that continue causing harm to innocents for generations.

            If OKCupid's "experiment" contributed to the formation of even one of those, which unfortunately is a very real possibility, everyone involved deserves the proverbial "lock 'em up and throw away the key" treatment.

              John Fenderson (profile), 24 Sep 2014 @ 2:11pm

              Re: Re: Re: Re: Re:

              Yes, of course I know what domestic violence is and that it's awful. I know it better than you might suppose.

              "If OKCupid's "experiment" contributed to the formation of even one of those"

              I don't see how there's any chance at all that it would*. That's why I was confused about the domestic violence connection.

              * I mean, aside from the fact that there's a risk that any relationship could become an abusive one, so anytime someone plays a role in connecting two people together, there's a chance that they are "contributing" to it.

                Mason Wheeler (profile), 24 Sep 2014 @ 2:27pm

                Re: Re: Re: Re: Re: Re:

                Because when they're deliberately connecting people together that they believe are a bad match, doesn't it stand to reason that the chance of that is significantly higher?

                  John Fenderson (profile), 24 Sep 2014 @ 2:58pm

                  Re: Re: Re: Re: Re: Re: Re:

                  No, I honestly don't think it does, for two reasons.

                  The first is that domestic abuse doesn't come about because of "bad matches". They come about because one or more of the people involved are broken. Given the way that OKCupid does its matching, they can't know if either of the people are broken. They only know if two people have given compatible answers to the questions.

                  The second is because everyone involved has plenty of time to determine for themselves how good of a match the other person is before dating even begins. If the other person is prone to violence and you can't determine that yourself through conversation, how is OKCupid supposed to know?

                  Niall (profile), 25 Sep 2014 @ 6:10am

                  Re: Re: Re: Re: Re: Re: Re:

                  Oh come on. I hate football. Pairing me with someone who loves football is a 'bad match'. I love computer games. Pairing me with someone who hates them is a 'bad match'.

                  You are jumping several bad logic hoops to align the idea of a 'bad' match with domestic violence. People are unfortunately far more likely to end up in domestic violence situations with people they otherwise match with, as those are the relationships that last and people are reluctant to get away from.

    DOlz (profile), 24 Sep 2014 @ 11:19am

    A new paradigm needed

    I don’t use Facebook, Google+, or any of their cousins. The reason is that they make their money by collecting and selling my info. Now they’ve added performing psychological experiments on their users to the list. If someone were to start a similar service that didn’t collect my info and protected my privacy in exchange for a monthly fee, that is something I would be interested in.

    Anonymous Coward, 24 Sep 2014 @ 11:47am

    Federal Funds?

    "The Common Rule basically says that if you're doing "research involving human subjects" with federal funds".
    Federal funds... Federal Funds....
    Facebook is using federal funds for the research?
    I doubt it...

      Mike Masnick (profile), 24 Sep 2014 @ 12:15pm

      Re: Federal Funds?

      "The Common Rule basically says that if you're doing "research involving human subjects" with federal funds".
      Federal funds... Federal Funds....
      Facebook is using federal funds for the research?
      I doubt it...


      Did you read the following sentence which explains that Maryland's law takes away the federal funds requirement?

        George Purcell (profile), 25 Sep 2014 @ 11:17am

        Re: Re: Federal Funds?

        Maryland can pass whatever law they want, but that extension is flatly unconstitutional. There's a reason the Common Rule specifies "Federal funds"--because the punishment is the civil loss of said funds.

      W Klink (profile), 24 Sep 2014 @ 12:18pm

      Re: Federal Funds?

      "Facebook is using federal funds for the research?
      I doubt it..."

      Their study was published in the "Proceedings of the National Academy of Sciences of the United States of America," so it's safe to assume that there was some federal money involved.

    Dave Cortright (profile), 24 Sep 2014 @ 11:56am

    Take it to the limit (one more time)

    Me: "Excuse me, I'm a tourist here and I'm looking for the Inner Harbor. Can you tell me where that is?"
    Native Marylander: "Outrageous! You are conducting research on me—a human subject—without my consent. Police!"

    Anonymous Coward, 24 Sep 2014 @ 12:04pm

    Can of Worms

    It looks to me like the Maryland law opens up a can of worms.

    The key phrase in the rule is "designed ... generalizable knowledge."

    So inadvertently figuring something out is ok (designed).

    Figuring out things that are specific to you is okay (generalizable).

    That means you have to be careful how you design your experiments to try to make sure that they are specific to you.

    That shouldn't be hard for most web sites since they are only aware of what goes on on their site. So any knowledge is almost by definition not generalizable without more experimentation.

    However, a company like Google, with an ad network that can potentially track people across many sites, has to be very careful with what it does. In fact, I'm willing to bet that some of Google's automated systems that maximize click-throughs run afoul of this law. Google is constantly doing experiments to see what ads people click on so that it can display more of those ads. But the Google ad network is expansive enough that any knowledge gained is almost guaranteed to be generalizable. Further, the experiments are designed specifically to see what ads get clicked on more, so it's hard to argue that the information is gained accidentally. It would be difficult for Google to design an experiment that would be harmful, but it looks to me like Google needs an IRB, or it needs to do business differently in Maryland.
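
    (Purely as an illustration of the kind of automated click-through optimization described above -- not Google's actual system, and with invented ad names -- the logic is basically an "epsilon-greedy" loop: most of the time show the ad with the best observed click rate, and occasionally show a random one so the system keeps learning.)

    import random

    # Illustrative click stats per ad; the ad names are invented.
    ad_stats = {
        "ad_a": {"shown": 0, "clicked": 0},
        "ad_b": {"shown": 0, "clicked": 0},
        "ad_c": {"shown": 0, "clicked": 0},
    }
    EPSILON = 0.1  # fraction of impressions spent "experimenting"

    def click_rate(ad):
        s = ad_stats[ad]
        return s["clicked"] / s["shown"] if s["shown"] else 0.0

    def choose_ad():
        """Usually show the best-performing ad; occasionally explore a random one."""
        if random.random() < EPSILON:
            return random.choice(list(ad_stats))
        return max(ad_stats, key=click_rate)

    def record_result(ad, clicked):
        """Update the observed stats after an impression."""
        ad_stats[ad]["shown"] += 1
        if clicked:
            ad_stats[ad]["clicked"] += 1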

    Of course Google probably already has an IRB, it's just there for different purposes, and I doubt this would be a significant burden on Google, so I think it's something they really should do anyway.

    Groaker (profile), 24 Sep 2014 @ 12:21pm

    Intentionally matching someone with a prospective mate they are expected to dislike, when contracted to do the opposite, is a civil tort.

      Anonymous Coward, 24 Sep 2014 @ 2:05pm

      Re:

      Dating site questionnaires are generally civil tort magnets by that standard. Let's be reasonable: the only thing different is intention. And the intention is not to find disliked dates, but to improve their questionable questionnaires.

    Dave Cortright (profile), 24 Sep 2014 @ 12:36pm

    I am outraged something I don't really understand changed for a small sub-population that probably doesn't include me!

    Let's not forget the very valid point Randall Munroe made in XKCD when this first came out:

    "Facebook shouldn't choose what stuff they show us to conduct unethical psychological research. They should only make those decisions based on, uh... However they were doing it before. Which was probably ethical, right?"

    http://xkcd.com/1390/

      Mason Wheeler (profile), 24 Sep 2014 @ 1:40pm

      Re: I am outraged something I don't really understand changed for a small sub-population that probably doesn't include me!

      First, you really ought to look at the alt text to that comic, which puts it in a new perspective.

      Second, I think that it has an invalid premise to begin with. Facebook shouldn't be choosing what to show users, period. The whole point of a social network is that the content comes from other users in the network; the platform is just a platform.

      But then again, this is Facebook we're talking about. "Don't even bother trying to pretend to not be evil."

        Dave Cortright (profile), 24 Sep 2014 @ 1:55pm

        Re: I am outraged something I don't really understand changed for a small sub-population that probably doesn't include me!

        Of course I read the ALT text. That doesn't change anything. Facebook is a private business and can choose what it wants to show to people, how it determines that, and make changes at any point. If people don't like it, they can leave.

        Your assertion of what Facebook should or should not be doing is laughable. Who are you to determine the rules that others "should" play by? Check out any 12 step program; you are powerless. Personally I LIKE the fact that when I say "I don't want to see this", Facebook hides that post and uses that information as an input for determining which new posts to show me.

        You don't like how they're doing it? Create your own social network and show us all how it's done.

          Mason Wheeler (profile), 24 Sep 2014 @ 2:14pm

          Re: Re: I am outraged something I don't really understand changed for a small sub-population that probably doesn't include me!

          If people don't like it, they can leave.

          People like to trot out lines like this to excuse bad behavior by corporations. "Oh, it's not like they're the Government or anything; you can simply choose not to do business with them." But with Facebook, that's simply not true. Whether you've joined the system or not, you're still part of the system. (And you say "you can leave," but have you ever tried to close a Facebook account?)

          "They're a private business" is not and never should be an excuse to not have to follow basic codes of conduct and ethics.

            Dave Cortright (profile), 24 Sep 2014 @ 2:27pm

            Re: I am outraged something I don't really understand changed for a small sub-population that probably doesn't include me!

            And who gets to decide what these "basic codes of conduct and ethics" are, Mr. Mason? You? What if my concept of what is ethical doesn't match yours? You are basically advocating government regulation to some undefined social standard simply because Facebook became big and successful. See also US v. Microsoft and EU v. Google.

            Yes, I deleted my Facebook account years ago when their privacy practices were abysmal. I rejoined later when things had gotten better and because I ran for public office and it was to the benefit of my campaign to connect with voters there.

              Mason Wheeler (profile), 24 Sep 2014 @ 3:02pm

              Re: Re: I am outraged something I don't really understand changed for a small sub-population that probably doesn't include me!

              ...and as if by clockwork, out comes the tired old Objectivist scare trope about "punishing success."

              Look, I know Libertarians aren't exactly what the rest of us would call "in touch with reality," but isn't that one seriously pushing things just a little too far? Attributing it to a vast faceless enemy like "The Government" is one thing, but do you actually personally know any real human being--even one--who believes that success is a thing that should be punished?

              What does need to be punished is not becoming big and successful, but becoming big and abusive. I have no problem with a large and powerful entity existing and using its powers for good; it's simply that I have no evidence that Facebook is such an entity, and plenty of evidence to the contrary. Same with Microsoft, especially 90s Microsoft!

              Being big and abusive does need to be punished, it does need to be regulated. That's the American way; it says so right in our oldest and most fundamental document, the Declaration of Independence:

              We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness. That to secure these rights, Governments are instituted among Men

              In other words, smacking down abusive entities who interfere with our rights to life, liberty and happiness is explicitly what governments are supposed to do, and I would welcome the current one doing so to Facebook.

                Dave Cortright (profile), 24 Sep 2014 @ 3:41pm

                Re: I am outraged something I don't really understand changed for a small sub-population that probably doesn't include me!

                And exactly how is Facebook interfering with your rights here? Did they force themselves into your home and force you to put up a profile? Go all Clockwork Orange on you and force you to view a newsfeed of cute cats and Farmville requests? Oh you poor thing. Abu Ghraib pales in comparison.

                FWIW, I do not identify as a libertarian. I for one value Net Neutrality, local fire departments, and the relative stability provided by the Federal Reserve. But as a consumer I know I have choices, and no one forces me to use Microsoft, Google, Facebook, or any other particular company's product/service.

                I am reminded of the Belgian newspapers v. Google. You sound a lot like them in saying you want to continue using Facebook because it's valuable, yet you want it to work a specific way that coincides with your particular definition of right and wrong. Good luck with that, my stone shaping friend.

                https://www.techdirt.com/articles/20110718/16394915157/

                  Anonymous Coward, 24 Sep 2014 @ 4:14pm

                  Re: Re: I am outraged something I don't really understand changed for a small sub-population that probably doesn't include me!

                  I don't use Facebook because I don't find it valuable. I have a profile because (about once a month on average) one specific friend of mine posts stuff on there that I enjoy seeing, and doesn't post it anywhere else, but I don't consider myself a real "Facebook user", and certainly not a typical one.

                  But as I pointed out several posts ago, with their shadow profile system, the old "you're not forced to be part of their system; if you don't want to, just choose not to" excuse simply is not factually correct. We get (rightfully) outraged over the NSA spying and building secret dossiers on people, but much less so over unaccountable private corporations doing the same thing. Why is that?

                  And what's with the constant harping on "my particular definition of right and wrong"? It's not mine; it's called morality, and it's been around a whole lot longer than I have. Many people believe it was revealed by one God or another; others claim it's a human construct. But what's more important isn't where it comes from, but what it is: a code for how people should interact with each other that has stood the test of time and proven itself, over millennia, to be the foundation of a strong and stable society.

                  The history of nations, and specifically of their rise and fall, is one long chain of virtue leading to prosperity, which leads to pride and self-centeredness, which leads to corruption, rotting the society from within, which leads to one of two outcomes: either the people change their ways and fix things, or their society is overthrown and conquered/destroyed by/assimilated into a more moral one. It's one of the great patterns of the ages, and we ignore it at our own peril.

                    Dave Cortright (profile), 24 Sep 2014 @ 4:30pm

                    Re: I am outraged something I don't really understand changed for a small sub-population that probably doesn't include me!

                    Seriously? You think there is one universal definition for "morality" and that Facebook just needs to follow it? If only it were that easy. You have a very naïve viewpoint, Mason. I hope for your sake you're a Freshman in college just exploring these issues for the first time. Because there's a big, diverse world out there and I can guarantee you that the majority of it does not think or believe the same way that you do.

                    I can't control what anyone does with information on me they scrape off the internet or convince my friends into revealing about me. But I don't see how this is any more shocking or "amoral" than what Pipl, Spokeo or even Google do. I suppose you think you have some "right to be forgotten" by Facebook and any other internet site out there, amiright?

                      Mason Wheeler (profile), 24 Sep 2014 @ 4:47pm

                      Re: Re: I am outraged something I don't really understand changed for a small sub-population that probably doesn't include me!

                      (Yeah, that was me. Somehow I got signed out while writing that comment.)

                      Guarantee all you want. If other people don't agree with something, and the thing in question is a matter of objective fact rather than a matter of opinion, then who agrees and who doesn't is irrelevant to the facts of the matter.

                      I'm a professional software developer who's been out of college for years now. I've lived all over the US and on another continent, and visited a third, and I've studied history, motivation, and human behavior pretty extensively, and the more I see, the more clear it becomes that modern radical ideas about alternative morality are neither modern, radical, nor alternative; to a one they're retreads of things that have been tried in ancient times, frequently went mainstream for a while, and then failed. Some of the ones that failed badly enough to get people killed or bring down entire civilizations with them ended up getting taboos and "thou shalt nots" attached to them.

                      What we call "traditional morality" today is what works, the distilled aggregate lab notes of thousands of years of experimentation in human civilization.

                      I suppose you think you have some "right to be forgotten" by Facebook and any other internet site out there, amiright?

                      I think that if it's wrong for a government spy agency to build secret dossiers and profiles on me, then it's equally wrong for a corporate spy agency to do so. It's really that simple.

                        Dave Cortright (profile), 25 Sep 2014 @ 11:17am

                        Re: I am outraged something I don't really understand changed for a small sub-population that probably doesn't include me!

                        I did a web search for "traditional morality" and this well-stated paragraph was one of the top hits:

                        "From time to time I hear people talking about 'traditional morality' or 'traditional values' as though it were a single thing, set in stone. They usually turn out to be talking about whatever morality or values they, themselves, accept. Calling your own values 'traditional' gives them a certain weight and authority—if whoever you're talking to doesn't have a critical mind."


                        I know there are whole communities of people who would consider a woman showing her face in a photo on Facebook to be immoral. Women covering their face in public is a tradition in their culture. Should Facebook cater to their view of traditional morality? Who resolves the conflict when the traditional morals of 2 or more cultures conflict?

                        John Fenderson (profile), 25 Sep 2014 @ 12:22pm

                        Re: Re: Re: I am outraged something I don't really understand changed for a small sub-population that probably doesn't include me!

                        "What we call "traditional morality" today is what works"

                        What do we call "traditional morality"? This is a serious question. Outside of a tiny number of bullet points (like "don't murder people") I can't actually find a moral code that is universally agreed upon, so the only "traditional" moral codes are those that are traditional within a given social group -- and those don't really agree with each other beyond the very basic points.

                          nasch (profile), 25 Sep 2014 @ 12:39pm

                          Re: Re: Re: Re: I am outraged something I don't really understand changed for a small sub-population that probably doesn't include me!

                          This is a serious question. Outside of a tiny number of bullet points (like "don't murder people") I can't actually find a moral code that is universally agreed upon,

                          There seem to be some groups who consider murdering people to be a moral act, too. I would guess there is literally no moral question that everyone agrees on.

                    Anonymous Coward, 25 Sep 2014 @ 4:46am

                    Re: Re: Re: I am outraged something I don't really understand changed for a small sub-population that probably doesn't include me!

                    'We get (rightfully) outraged over the NSA spying and building secret dossiers on people, but much less so over unaccountable private corporations doing the same thing. Why is that?'

                    Well, perhaps people are fickle creatures. For example, remember, as a group we now appear to hold our athletes to higher moral and behavioral standards than we do our government leaders. That which can get you thrown out of the NFL can oddly enough almost guarantee you re-election as a politician. Go figure.

                    Anonymous Coward, 1 Oct 2014 @ 4:45am

                    Re: Re: Re: I am outraged something I don't really understand changed for a small sub-population that probably doesn't include me!

                    It is worth noting that EU law gives European citizens (even those resident outside the EU, although that makes the procedures harder) the right to force any company doing business there to delete things like shadow profiles.

    Jeffrey Nonken (profile), 24 Sep 2014 @ 4:56pm

    n.b. The "Pepsi Challenge" taste test that was started in 1975 is not research, it's a marketing ploy in research's clothing. The whole point wasn't to determine which was more popular, Coke or Pepsi, but to get you thinking in terms of "Coke or Pepsi" exclusive of any other brand of cola. A false dilemma writ planet-sized. We are being manipulated.

    (Not that this detracts from your argument in any way, and could possibly reinforce it. I just thought I'd toss in a bit of tangential tinfoil hattism for anybody who wasn't feeling paranoid enough. Carry on.)

      nasch (profile), 24 Sep 2014 @ 8:19pm

      Re:

      The whole point wasn't to determine which was more popular, Coke or Pepsi, but to get you thinking in terms of "Coke or Pepsi" exclusive of any other brand of cola.

      I don't know if it's true, but I heard that the Pepsi was labeled "M" and the Coke "Q", because they knew that in a blind test people are more likely to choose M when the two items are actually the same.

    toyotabedzrock (profile), 25 Sep 2014 @ 12:01am

    Is the defining line for what the Common Rule covers crossed when there is a research paper put forth? Or is it perhaps the obvious internal forethought that they saw this as research?

    You need to not allow yourself to get stuck in the mental theory so often, and to look at the end result and at what rights an individual has over themselves, which supersede a right to profit.

    Anonymous Coward, 25 Sep 2014 @ 10:56am

    A/B testing is little more than a popularity contest that you happen to be measuring in income/dollars, page views, new subscriptions, or similar.

    That is *very* different from what Facebook did, when it implemented an experiment that attempted to *effect different moods* in users who were *expecting* the "usual Facebook experience". If I remember correctly, they concluded they DID modify (with statistical significance) the mood of several hundred people. Their own conclusion admits the impact they had on their test subjects, and yet they still act indifferent, as if even the very *thought* of running the ethics of their experiment by a 3rd party is some triviality that is beneath them.

    For me, though, the damning thing is... the experiment WAS rather trivial. It almost certainly would have gotten approved, and they might have even gotten a waiver for some of the usual before-the-experiment informed consent checks that might have invalidated the results. They could have even worked with the IRB at some existing university to set up some "web/social industry"-specific IRB that would be able to respond even faster to these kinds of experiments in the future.

    No, instead they act like spoiled narcissists who cannot abide being critiqued by others. So even though the Maryland law is kind of a problem, they deserve to get the book thrown at them, if only as a splash of cold water to their ego. Maybe then they will start to understand that *yes*, you *do* need to get a 3rd party to check over your intentions *before* you start poking at people's emotions.

    George Purcell (profile), 25 Sep 2014 @ 11:21am

    A completely specious argument.

    The Common Rule doesn't create a criminal penalty--indeed, if it tried to do so it would be immediately struck down on First Amendment grounds. Rather, it compels action through the threat of removing Federal funds from the IHE.

    If you're not an IHE and not a Federal grantee, it simply does not bind you. And any attempt, like Maryland's, to make it do so is flatly unconstitutional.

    Alan, 25 Sep 2014 @ 11:45am

    Old problem, already solved

    Facially, the definition clearly applies to what has commonly been called "basic research" and equally clearly does not apply to what has been commonly called "applied research". Concepts and practice for each are pretty well established; no need to reinvent the wheel, here.

    Anonymous Cowherd, 25 Sep 2014 @ 1:04pm

    What the researcher plans on doing with the result is certainly not the deciding factor in whether a human experiment is ethical or not.


