Law Professor Claims Any Internet Company 'Research' On Users Without Review Board Approval Is Illegal
from the you-sure-you-want-to-go-there dept
For many years I've been a huge fan of law professor James Grimmelmann. His legal analysis on various issues is often quite valuable, and I've quoted him more than a few times. However, he's now arguing that the now-infamous Facebook happiness experiment and the similarly discussed OkCupid "hook you up with someone you should hate" experiment weren't just unethical, but illegal. Grimmelmann, it should be noted, was one of the loudest voices arguing (quite vehemently) that these experiments were horrible and dangerous, and that the academic aspect of Facebook's research violated long-standing rules.
But his new argument takes it even further, arguing not just that the experiments were unethical, but flat-out illegal, based on his reading of the Common Rule and a particular Maryland law that effectively extends it. The Common Rule basically says that if you're doing "research involving human subjects" with federal funds, you need "informed consent" from the subjects, plus approval from an institutional review board (IRB) -- a body, found at basically every research university, that has to sign off on all research. The idea is to prevent seriously harmful or dangerous experiments. The Maryland law takes the Common Rule and applies it not just to federally funded research but to "all research conducted in Maryland."
To Grimmelmann, this is damning for both companies -- and for basically any company doing any research involving people in Maryland. In fact, he almost gleefully posts the letter he got back from Facebook after he alerted the company to the Maryland law. Why so gleeful? Because Facebook's Associate General Counsel for Privacy, Edward Palmieri, repeatedly referred to what Facebook did as "research," leading Grimmelmann to play the "gotcha" card, as if that alone proves Facebook's efforts were subject to the Maryland law (and thus to the Common Rule). He then overreacts to Palmieri's position -- accurate, in our opinion -- that the Maryland law does not apply to Facebook's research, claiming that in taking that position Facebook is declaring that the company "is above the law that applies to everyone else."
Except... all of that is suspect. Facebook is not claiming it is above the law that applies to everyone else. It claims that the law does not apply to it... or basically any company doing research to improve its services. Grimmelmann insists that his reading of Maryland's House Bill 917 is the only possible reading, but he may be hard pressed to find many who actually agree with that interpretation. The Common Rule's definition of "research" is fairly broad, but I don't think it's nearly as broad as Grimmelmann wants it to be. Here it is:
Research means a systematic investigation, including research development, testing and evaluation, designed to develop or contribute to generalizable knowledge.
I think it's that last bit that may be problematic for Grimmelmann. It focuses on academic research "designed to develop or contribute to generalizable knowledge." That wording, while unfortunately vague, really appears to be aimed at those doing research for the purpose of producing more publicly available knowledge. And while the Facebook effort perhaps touches on that, since it eventually became published research, it still seems like a stretch. Facebook wasn't doing its research for the purpose of contributing to generalizable knowledge -- it was trying to improve the Facebook experience. Having done that, the company also shared some of the data publicly. Similarly, OkCupid's research was aimed at improving its own services.
But under Grimmelmann's interpretation of the law, you'd get some seriously crazy results. Basic A/B testing of different website designs could be designated illegal research absent IRB approval or informed consent. I was just reading about a service that lets you put as many headlines on a blog post as you want and automatically rotates them, trying to optimize for whichever one gets the best results (a sketch of how such a rotator might work is below). Would that require informed consent and an IRB? Just the fact that companies call something "research" doesn't make it research under the Common Rule definition. How about a film studio taking a survey after a screening? The movie manipulates the emotions of the "human subjects," and the studio then does research on their reactions. Does that require "informed consent" and an IRB?
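For the curious, here's a minimal sketch of what that kind of headline "experiment" amounts to under the hood -- a toy epsilon-greedy rotator in Python, with hypothetical names and made-up click rates, not any actual service's code:

import random

# Hypothetical illustration only: an epsilon-greedy headline rotator.
# Most visitors see the best-performing headline so far; a small slice
# of traffic keeps exploring the alternatives.

class HeadlineRotator:
    def __init__(self, headlines, epsilon=0.1):
        self.headlines = headlines
        self.epsilon = epsilon  # fraction of traffic used to explore
        self.shows = {h: 0 for h in headlines}
        self.clicks = {h: 0 for h in headlines}

    def click_rate(self, headline):
        # Unseen headlines get an optimistic 1.0 so each gets tried at least once.
        shows = self.shows[headline]
        return self.clicks[headline] / shows if shows else 1.0

    def pick(self):
        if random.random() < self.epsilon:
            choice = random.choice(self.headlines)  # explore
        else:
            choice = max(self.headlines, key=self.click_rate)  # exploit
        self.shows[choice] += 1
        return choice

# Simulate 1,000 visitors, each with a made-up chance of clicking.
true_rates = {"Headline A": 0.03, "Headline B": 0.05, "Headline C": 0.02}
rotator = HeadlineRotator(list(true_rates))
for _ in range(1000):
    shown = rotator.pick()
    if random.random() < true_rates[shown]:
        rotator.clicks[shown] += 1
print("winner:", max(rotator.clicks, key=rotator.clicks.get))

Every pass through that loop treats a visitor as a data point in a designed experiment -- which is exactly why a reading of "research" broad enough to catch Facebook would catch this, too.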
How about a basic taste test -- Coke or Pepsi? Which do you prefer? It's research. It's developing knowledge via "human subjects." But does anyone honestly think the law means that any company setting up such a taste test first needs an IRB to approve it? The results of Grimmelmann's interpretation of the law are nonsensical. Grimmelmann is clearly upset about the original research, and certainly there were lots of people who felt it was creepy and potentially inappropriate. But his focus on actively punishing these companies is reaching obsession levels:
For one thing, many academic journals require Common Rule compliance for everything they publish, regardless of funding source. So my colleague Leslie Meltzer Henry and I wrote a letter to the journal that published the Facebook emotional manipulation study, pointing out the obvious noncompliance. For another, nothing in Facebook's user agreement warned users they were signing up to be test subjects. So we wrote a second letter to the Federal Trade Commission, which tends to get upset when companies' privacy policies misrepresent things. And for yet another, researchers from universities that do take federal funding can't just escape their own Common Rule obligations by "IRB laundering" everything through a private company. So we wrote a third letter to the federal research ethics office about the Cornell IRB's questionable review of two Cornell researchers' collaborations with Facebook.
And that's before the letters to Facebook and OkCupid -- and, of course, to Maryland's attorney general, Doug Gansler. If Gansler actually tried to enforce such an interpretation of the law (which is not out of the question, given how quick many attorneys general are to jump on grandstanding issues that will get headlines), it would produce a very dangerous result -- one in which very basic forms of experimentation and modification in all sorts of industries (well beyond just the internet) would suddenly carry a risk of law-breaking. That's a result incompatible with basic common sense. Grimmelmann's response seems to be "but the law is the law," but that's based entirely on his stretched interpretation of that law -- one that many others would likely challenge.
Filed Under: common rule, experiments, human subjects, informed consent, irb, james grimmelmann, maryland, research
Companies: facebook, okcupid
Reader Comments
So, just set up a review board.
Disregard illegal laws
Re: Disregard illegal laws
No researcher can claim protection under the First Amendment for experiments that have an impact on others. Your First Amendment rights end when they impinge on others' natural or constitutional rights to health and welfare.
At the extreme, a "Dr. Mengele" cannot claim that the horrific experiments he/she performed on others without (or with) their consent are protected by the First Amendment.
Re: Re: Disregard illegal laws
Besides, using his logic, all advertising would be illegal since it's used for research into purchasing habits, and is often displayed in such a way that informed consent isn't present.
Re: Re: Disregard illegal laws
Just like right now: I don't need your permission to reply to your post. Far cry from Dr. Mengele! (Borderline Godwin, BTW.)
Re: Re: Re: Disregard illegal laws
You do not have unrestricted free speech rights on my physical private property, for example. In fact, I can engage in prior restraint should I so choose.
That is the nuance being discussed here, where digital boundaries are exchanged for physical ones.
We'll need proof of that -- a lawyer cannot be trusted to answer truthfully where his (very rich) client is concerned.
Re:
Facebook collaborated with Cornell University, and the resulting journal article was published by the National Academy of Sciences.
Re: Re:
If these rules seem like too much for this type of research, get the rules altered appropriately. That doesn't affect what private companies do otherwise.
No more A/B testing without IRB approval!
Taking his view, simple A/B testing of a website or landing page would violate the law unless you had an IRB approve it.
Re: Re: No more A/B testing without IRB approval!
Did they use federal money? There were initial claims that Cornell did, but Cornell later said that was a mistake and it didn't use federal funds.
Re: Re: Re: No more A/B testing without IRB approval!
Then again, you really have to look into the Code of Federal Regulations and its definitions -- which are explicitly referenced in the Maryland law -- to see what is covered and what is not. It's too late in a long day of an already long week for me to go traipsing through the CFR unless I'm getting paid for it, though.
But there's a very good chance that an average, reasonably well-informed layperson would look at both of these corporate "studies" and conclude that that is exactly what they are: actively designed to inflict harm and emotional distress and measure the results.
The appropriate question to ask in a situation like this isn't "should this be considered legal under a particular interpretation of a particular state law or not?" It's "should people who do stuff like this be prosecuted for crimes against humanity or not?"
Re:
I disagree with that characterization of those tests, actually. Neither of them were "designed to inflict harm and emotional distress". Further, as near as I can tell, no harm was actually inflicted.
For the record, I find both of those tests objectionable. But there's no need to overstate the situation.
Re: Re:
In Facebook's case, a trusted news source (and yes, it was trusted by its users, whether or not it should have been) deliberately fed bad news to its lab ra... ahem, sorry, to its users to measure the negative emotional impact it would have on them. And they also did the opposite--trying to manipulate people with good news and distort their emotions in a positive direction and measure its effectiveness--which may appear less creepy on the surface but is possibly even worse; do you want powerful entities knowing how to pacify you by inducing happiness when something is going wrong that you should be agitated about? This is stuff straight out of a dystopian fiction novel, becoming real before our eyes.
And OKCupid's experiment, setting people up with dates they "would probably hate" is, if anything, even more reprehensible. Relationships gone wrong are emotionally distressing pretty much by default, at the very least, and the harm done only goes up from there. The people who ran that experiment ought to be thrown in prison, and anybody with a happy sheltered life who's never experienced domestic violence has no right to tell me otherwise. Period.
There are some lines that should never be crossed, and Facebook and OKCupid crossed a couple of them, and I sincerely do want to see some very heavy-handed criminal prosecution for it.
Re: Re: Re:
"deliberately fed bad news to its lab ra... ahem, sorry, to its users to measure the negative emotional impact it would have on them"
This is a little misleading and very loaded. Facebook was not feeding users anything that wasn't in their mix to begin with. What Facebook did was slightly increase the odds that posts containing certain keywords, a "positive" set and a "negative" set, would make it into the main feed. They weren't looking for "good news" and "bad news," but rather the tone of the post. And what they wanted to measure was not the emotional impact of this, but whether the overall tone of the page made it more or less likely that people would interact with the site.
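For what it's worth, a toy sketch of that mechanism -- made-up keywords and weights, not anything from Facebook's actual code -- looks something like this:

import random

# Hypothetical illustration: posts containing words from one tone set get a
# slightly better chance of making the feed; everything shown was already in
# the user's candidate mix.

POSITIVE = {"happy", "great", "love"}
NEGATIVE = {"sad", "awful", "hate"}

def tone(post):
    words = set(post.lower().split())
    if words & POSITIVE:
        return "positive"
    if words & NEGATIVE:
        return "negative"
    return "neutral"

def build_feed(candidate_posts, boosted_tone, base=0.5, boost=1.2):
    feed = []
    for post in candidate_posts:
        p = base * (boost if tone(post) == boosted_tone else 1.0)
        if random.random() < min(p, 1.0):  # nudged odds, not injected content
            feed.append(post)
    return feed

print(build_feed(["I love this", "awful day", "lunch was fine"], "positive"))

The knob is a small probability tweak on posts already eligible to appear, and the measured outcome is whether people keep interacting -- not a direct reading of anyone's emotional state.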
"setting people up with dates they "would probably hate" is, if anything, even more reprehensible. Relationships gone wrong are emotionally distressing pretty much by default, at the very least, and the harm done only goes up from there."
This is certainly overstating. First, OKCupid doesn't set anyone up on dates at all. It suggests people for you to talk to. You interact with them online and, if it seems you get along, you might agree to go on a date.
When OKCupid intentionally mismatched people, it wasn't committing them to anything at all, let alone dates or relationships. The worst thing that would happen is you exchange a few emails and decide you don't like each other. If you're deceived by the time a date actually happens... well, that can't possibly be OKCupid's fault.
"who's never experienced domestic violence has no right to tell me otherwise"
What does domestic violence have to do with this? I genuinely don't see the connection.
Re: Re: Re: Re:
Have you had a mostly happy life?
It has to do with when relationships go wrong. It's pretty much always bad, but how bad it is really varies. At the light end, there's emotional distress. Somewhere in the middle you get stuff like stalking, rape, domestic violence and murder. But the really bad effects are far worse: when you mix an abusive relationship and families with children, you get cycles of abuse and domestic problems that continue causing harm to innocents for generations.
If OKCupid's "experiment" contributed to the formation of even one of those, which unfortunately is a very real possibility, everyone involved deserves the proverbial "lock 'em up and throw away the key" treatment.
Re: Re: Re: Re: Re:
"If OKCupid's "experiment" contributed to the formation of even one of those"
I don't see how there's any chance at all that it would*. That's why I was confused about the domestic violence connection.
* I mean, aside from the fact that there's a risk that any relationship could become an abusive one, so anytime someone plays a role in connecting two people together, there's a chance that they are "contributing" to it.
Re: Re: Re: Re: Re: Re: Re:
The first is that domestic abuse doesn't come about because of "bad matches". It comes about because one or more of the people involved is broken. Given the way that OKCupid does its matching, it can't know if either of the people is broken. It only knows whether two people have given compatible answers to the questions.
The second is that everyone involved has plenty of time to determine for themselves how good a match the other person is before dating even begins. If the other person is prone to violence and you can't determine that yourself through conversation, how is OKCupid supposed to know?
Re: Re: Re: Re: Re: Re: Re:
You are jumping through several bad-logic hoops to connect the idea of a "bad" match with domestic violence. People are unfortunately far more likely to end up in domestic-violence situations with people they otherwise match well with, as those are the relationships that last and that people are reluctant to get away from.
Federal Funds?
Federal funds... Federal Funds....
Facebook is using federal funds for the research?
I doubt it...
Re: Federal Funds?
Federal funds... Federal Funds....
Facebook is using federal funds for the research?
I doubt it...
Did you read the very next sentence, which explains that Maryland's law takes away the federal-funds requirement?
Re: Federal Funds?
"I doubt it..."
Their study was published in the "Proceedings of the National Academy of Sciences of the United States of America," so it's safe to assume that there was some federal money involved.
Take it to the limit (one more time)
Native Marylander: "Outrageous! You are conducting research on me—a human subject—without my consent. Police!"
Can of Worms
The key phrase in the rule is "designed ... generalizable knowledge."
So inadvertently figuring something out is OK (it wasn't "designed").
Figuring out things that are specific to you is OK (it isn't "generalizable").
That means you have to be careful to design your experiments so that the results are specific to you.
That shouldn't be hard for most websites, since they are only aware of what goes on on their own site. So any knowledge they gain is almost by definition not generalizable without more experimentation.
However, a company like Google, with an ad network that can potentially track people across many sites, has to be very careful with what it does. In fact, I'm willing to bet that some of Google's automated systems that maximize click-throughs run afoul of this law. Google is constantly doing experiments to see which ads people click on, so that it can display more of those ads. But the Google ad network is expansive enough that any knowledge it gains is almost guaranteed to be generalizable. Further, the experiments are designed specifically to see which ads get clicked on more, so it's hard to argue that the information is gained accidentally. It would be difficult for Google to design an experiment that would be harmful, but it looks to me like Google needs an IRB, or it needs to do business differently in Maryland. (See the sketch of the specific-vs.-generalizable distinction below.)
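To illustrate that specific-vs.-generalizable distinction with a toy example (hypothetical names and numbers, nobody's real ad stack):

from collections import defaultdict

# Hypothetical illustration: per-site click stats stay "specific" to each
# site, while pooling the same events across the whole network yields
# knowledge about the ad (and its audience) in general.

def new_stats():
    return [0, 0]  # [shows, clicks]

per_site = defaultdict(lambda: defaultdict(new_stats))  # site -> ad -> stats
pooled = defaultdict(new_stats)                         # ad -> stats

def record(site, ad, clicked):
    for stats in (per_site[site][ad], pooled[ad]):
        stats[0] += 1
        stats[1] += int(clicked)

def ctr(stats):
    shows, clicks = stats
    return clicks / shows if shows else 0.0

record("site-a.example", "ad1", clicked=True)
record("site-b.example", "ad1", clicked=False)
print(ctr(per_site["site-a.example"]["ad1"]))  # site-specific knowledge
print(ctr(pooled["ad1"]))                      # network-wide, arguably generalizable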
Of course, Google probably already has an IRB -- it's just there for different purposes -- and I doubt this would be a significant burden on Google, so I think it's something it really should do anyway.
I am outraged something I don't really understand changed for a small sub-population that probably doesn't include me!
"Facebook shouldn't choose what stuff they show us to conduct unethical psychological research. They should only make those decisions based on, uh... However they were doing it before. Which was probably ethical, right?"
http://xkcd.com/1390/
Re: I am outraged something I don't really understand changed for a small sub-population that probably doesn't include me!
Second, I think that it has an invalid premise to begin with. Facebook shouldn't be choosing what to show users, period. The whole point of a social network is that the content comes from other users in the network; the platform is just a platform.
But then again, this is Facebook we're talking about. "Don't even bother trying to pretend to not be evil."
Re: I am outraged something I don't really understand changed for a small sub-population that probably doesn't include me!
Your assertion of what Facebook should or should not be doing is laughable. Who are you to determine the rules that others "should" play by? Check out any 12-step program; you are powerless. Personally, I LIKE the fact that when I say "I don't want to see this," Facebook hides that post and uses that information as an input for determining which new posts to show me.
You don't like how they're doing it? Create your own social network and show us all how it's done.
Re: Re: I am outraged something I don't really understand changed for a small sub-population that probably doesn't include me!
People like to trot out lines like this to excuse bad behavior by corporations. "Oh, it's not like they're the Government or anything; you can simply choose not to do business with them." But with Facebook, that's simply not true. Whether you've joined the system or not, you're still part of the system. (And you say "you can leave," but have you ever tried to close a Facebook account?)
"They're a private business" is not and never should be an excuse to not have to follow basic codes of conduct and ethics.
Re: I am outraged something I don't really understand changed for a small sub-population that probably doesn't include me!
Yes, I deleted my Facebook account years ago when their privacy practices were abysmal. I rejoined later when things had gotten better and because I ran for public office and it was to the benefit of my campaign to connect with voters there.
Re: Re: I am outraged something I don't really understand changed for a small sub-population that probably doesn't include me!
Look, I know Libertarians aren't exactly what the rest of us would call "in touch with reality," but isn't that one seriously pushing things just a little too far? Attributing it to a vast faceless enemy like "The Government" is one thing, but do you actually personally know any real human being--even one--who believes that success is a thing that should be punished?
What does need to be punished is not becoming big and successful, but becoming big and abusive. I have no problem with a large and powerful entity existing and using its powers for good; it's simply that I have no evidence that Facebook is such an entity, and plenty of evidence to the contrary. Same with Microsoft, especially 90s Microsoft!
Being big and abusive does need to be punished; it does need to be regulated. That's the American way; it says so right in our oldest and most fundamental document, the Declaration of Independence:
"...That to secure these rights, Governments are instituted among Men, deriving their just powers from the consent of the governed, -- That whenever any Form of Government becomes destructive of these ends, it is the Right of the People to alter or to abolish it..."
In other words, smacking down abusive entities who interfere with our rights to life, liberty and happiness is explicitly what governments are supposed to do, and I would welcome the current one doing so to Facebook.
Re: I am outraged something I don't really understand changed for a small sub-population that probably doesn't include me!
FWIW, I do not identify as a libertarian. I for one value Net Neutrality, local fire departments, and the relative stability provided by the Federal Reserve. But as a consumer I know I have choices, and no one forces me to use Microsoft, Google, Facebook, or any other particular company's product/service.
I am reminded of the Belgian newspapers v. Google. You sound a lot like them in saying you want to continue using Facebook because it's valuable, yet you want it to work a specific way that coincides with your particular definition of right and wrong. Good luck with that, my stone-shaping friend.
https://www.techdirt.com/articles/20110718/16394915157/
Re: Re: I am outraged something I don't really understand changed for a small sub-population that probably doesn't include me!
But as I pointed out several posts ago, with their shadow profile system, the old "you're not forced to be part of their system; if you don't want to, just choose not to" excuse simply is not factually correct. We get (rightfully) outraged over the NSA spying and building secret dossiers on people, but much less so over unaccountable private corporations doing the same thing. Why is that?
And what's with the constant harping on "my particular definition of right and wrong"? It's not mine; it's called morality, and it's been around a whole lot longer than I have. Many people believe it was revealed by one God or another; others claim it's a human construct. But what's more important isn't where it comes from, but what it is: a code for how people should interact with each other that has stood the test of time and proven itself, over millennia, to be the foundation of a strong and stable society.
The history of nations, and specifically of their rise and fall, is one long chain of virtue leading to prosperity, which leads to pride and self-centeredness, which leads to corruption, rotting the society from within, which leads to one of two outcomes: either the people change their ways and fix things, or their society is overthrown and conquered/destroyed by/assimilated into a more moral one. It's one of the great patterns of the ages, and we ignore it at our own peril.
Re: I am outraged something I don't really understand changed for a small sub-population that probably doesn't include me!
I can't control what anyone does with information on me they scrape off the internet or convince my friends into revealing about me. But I don't see how this is any more shocking or "amoral" than what Pipl, Spokeo or even Google do. I suppose you think you have some "right to be forgotten" by Facebook and any other internet site out there, amiright?
Re: Re: I am outraged something I don't really understand changed for a small sub-population that probably doesn't include me!
Guarantee all you want. If other people don't agree with something, and the thing in question is a matter of objective fact rather than a matter of opinion, then who agrees and who doesn't is irrelevant to the facts of the matter.
I'm a professional software developer who's been out of college for years now. I've lived all over the US and on another continent, and visited a third, and I've studied history, motivation, and human behavior pretty extensively, and the more I see, the more clear it becomes that modern radical ideas about alternative morality are neither modern, radical, nor alternative; to a one they're retreads of things that have been tried in ancient times, frequently went mainstream for a while, and then failed. Some of the ones that failed badly enough to get people killed or bring down entire civilizations with them ended up getting taboos and "thou shalt nots" attached to them.
What we call "traditional morality" today is what works, the distilled aggregate lab notes of thousands of years of experimentation in human civilization.
I think that if it's wrong for a government spy agency to build secret dossiers and profiles on me, then it's equally wrong for a corporate spy agency to do so. It's really that simple.
Re: I am outraged something I don't really understand changed for a small sub-population that probably doesn't include me!
I know there are whole communities of people who would consider a woman showing her face in a photo on Facebook to be immoral. Women covering their face in public is a tradition in their culture. Should Facebook cater to their view of traditional morality? Who resolves the conflict when the traditional morals of 2 or more cultures conflict?
Re: Re: Re: I am outraged something I don't really understand changed for a small sub-population that probably doesn't include me!
What do we call "traditional morality"? This is a serious question. Outside of a tiny number of bullet points (like "don't murder people") I can't actually find a moral code that is universally agreed upon, so the only "traditional" moral codes are those that are traditional within a given social group -- and those don't really agree with each other beyond the very basic points.
Re: Re: Re: Re: I am outraged something I don't really understand changed for a small sub-population that probably doesn't include me!
There seem to be some groups who consider murdering people to be a moral act, too. I would guess there is literally no moral question that everyone agrees on.
Re: Re: Re: I am outraged something I don't really understand changed for a small sub-population that probably doesn't include me!
Well, perhaps people are fickle creatures. For example, as a group we now appear to hold our athletes to higher moral and behavioral standards than we do our government leaders. That which can get you thrown out of the NFL can, oddly enough, almost guarantee you re-election as a politician. Go figure.
(Not that this detracts from your argument in any way -- it could possibly even reinforce it. I just thought I'd toss in a bit of tangential tinfoil-hattism for anybody who wasn't feeling paranoid enough. Carry on.)
Re:
I don't know if it's true, but I heard that the Pepsi was labeled "M" and the Coke "Q", because they knew that in a blind test people are more likely to choose M when the two items are actually the same.
You need to stop allowing yourself to get stuck in mental theory so often, and look instead at the end result and at what rights an individual has over themselves -- rights which supersede any right to profit.
That is *very* different from what Facebook did when it implemented an experiment that attempted to *effect different moods* in users who were *expecting* the "usual Facebook experience". If I remember correctly, they concluded they DID modify (with statistical significance) the mood of several hundred people. Their own conclusion admits the impact they had on their test subjects, and yet they still act indifferent, as if even the very *thought* of running the ethics of their experiment by a 3rd party is some triviality that is beneath them.
For me, though, the damning thing is... the experiment WAS rather trivial. It almost certainly would have gotten approved, and they might even have gotten a waiver for some of the usual before-the-experiment informed-consent checks that might have invalidated the results. They could even have worked with the IRB at some existing university to set up a "web/social industry"-specific IRB that would be able to respond even faster to these kinds of experiments in the future.
No, instead they act like spoiled narcissists who cannot abide being critiqued by others. So even though the Maryland law is kind of a problem, they deserve getting the book thrown at them, if only as a splash of cold water to their ego. Maybe then they will start to understand that *yes*, you *do* need to get a 3rd party to check over your intentions *before* you start poking at people's emotions.
The Common Rule doesn't create a criminal penalty -- indeed, if it tried to do so it would be immediately struck down on First Amendment grounds. Rather, it compels action through the threat of removing federal funds from the IHE (institution of higher education).
If you're not an IHE and not a federal grantee, it simply does not bind you. And any attempt, like Maryland's, to make it do so is flatly unconstitutional.