from the eat-it dept
Look, I probably don't have to tell you Techdirt readers this, but I'm a strange sort of cat. I could go into all the reasons why I'm odd, but whenever I try to explain to people how non-normal I am, I usually just reveal this little bit of truth: I hate chocolate. No, I don't not-love chocolate. Nor do I dislike chocolate. I fucking hate it, nearly as much as I hate how low I appeared on this ingenious bit of sleuthing a commenter did in determining which Techdirt writers swear the most (a list which I insist is fucking bullshit, by the way). That said, everyone else loves chocolate, of course, so I'm sure they and many others were thrilled to see so many well-respected publications blaring headlines recently about how chocolate can help reduce weight. I'd show you a bunch of links to those stories put forth by supposedly well-respected journalism outlets and scientific journals that make heavy claims about peer review and fact-checking, but I can't because most of those stories have been pulled. Why?
Because the whole thing was a bullshit hoax put on by a journalist to make the point that, at least when it comes to studies around diet and health, the journals and the media that report on their papers are largely full of crap. Go read that entire thing, because it's absolutely fascinating, but I'll happily give you the truncated version. John Bohannon, who has a Ph.D. in the molecular biology of bacteria and is also a journalist, conspired with a German reporter, Peter Onneken, to see how easily they could fool the media into running BS headlines. They did this by turning John Bohannon into Johannes Bohannon (obviously) and creating a website for The Institute of Diet and Health, which isn't actually a thing. Then they conducted a very real study with three groups: one group eating a low-carb diet, one group eating their regular diet, and one group eating a low-carb diet plus a 1.5 oz bar of dark chocolate daily. After running background checks on the participants, conducting blood tests to screen for disease and eating disorders, and hiring a German doctor and statistician to perform the study, away they went. The results?
Onneken then turned to his friend Alex Droste-Haars, a financial analyst, to crunch the numbers. One beer-fueled weekend later and... jackpot! Both of the treatment groups lost about 5 pounds over the course of the study, while the control group’s average body weight fluctuated up and down around zero. But the people on the low-carb diet plus chocolate? They lost weight 10 percent faster. Not only was that difference statistically significant, but the chocolate group had better cholesterol readings and higher scores on the well-being survey.
Bam, results! Not just results, but results the media would absolutely love to sink their idiotic teeth into. The problem? Well, the method for running the entire study was bullshit.
Here’s a dirty little science secret: If you measure a large number of things about a small number of people, you are almost guaranteed to get a “statistically significant” result. Our study included 18 different measurements—weight, cholesterol, sodium, blood protein levels, sleep quality, well-being, etc.—from 15 people. (One subject was dropped.) That study design is a recipe for false positives.
Think of the measurements as lottery tickets. Each one has a small chance of paying off in the form of a “significant” result that we can spin a story around and sell to the media. The more tickets you buy, the more likely you are to win. We didn’t know exactly what would pan out—the headline could have been that chocolate improves sleep or lowers blood pressure—but we knew our chances of getting at least one “statistically significant” result were pretty good.
Bohannon goes into some of the gory math, and it really is fun to read, but the core idea is pretty easy to understand. With a small enough sample size and a wide enough range of measurements, random noise is almost guaranteed to hand you at least one "statistically significant" result somewhere. It's simple: people are different, tiny groups don't average those differences out, and the more things you measure, the better your odds that one of them lines up by pure chance.
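To make that lottery-ticket analogy concrete, here's a minimal Python sketch of the trap (my own illustration, not Bohannon's actual code or data): two tiny groups drawn from the exact same distribution, 18 outcome measures, and an ordinary t-test on each. Since there is no real effect anywhere, every "significant" result is a false positive by construction, yet most of the simulated "studies" still produce at least one.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

N_STUDIES = 10_000   # simulated no-effect "studies"
N_MEASURES = 18      # outcomes per study (weight, cholesterol, sleep, ...)
GROUP_SIZE = 5       # people per group, roughly the size of the hoax's groups

studies_with_a_headline = 0
for _ in range(N_STUDIES):
    for _ in range(N_MEASURES):
        # Both groups come from the SAME distribution, so any "significant"
        # difference between them is pure noise.
        treatment = rng.normal(size=GROUP_SIZE)
        control = rng.normal(size=GROUP_SIZE)
        _, p_value = stats.ttest_ind(treatment, control)
        if p_value < 0.05:
            studies_with_a_headline += 1
            break

print(f"Fake studies yielding at least one 'significant' result: "
      f"{studies_with_a_headline / N_STUDIES:.0%}")

Treating the 18 measures as independent (real physiological measures aren't, but it's close enough for a back-of-the-envelope), the chance of at least one false positive at p < 0.05 is 1 − 0.95^18, or roughly 60 percent, which is about what the simulation above prints out.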
Anyway, the team then went to the International Archives of Medicine, which Bohannon identifies as a "fake journal" publisher. In other words, pay enough Euros and your "study" gets "published", all without the bothersome time-waster known as peer review. Not that the IAM doesn't claim to be peer-reviewed. It certainly does make that claim, but after payment was accepted Bohannon found that their study had been accepted without change. And, keep in mind, this study is designed to be bad. So, once the study had been published, it was time for the PR machine to swing into action.
Take a look at the press release I cooked up. It has everything. In reporter lingo: a sexy lede, a clear nut graf, some punchy quotes, and a kicker. And there’s no need to even read the scientific paper because the key details are already boiled down. I took special care to keep it accurate. Rather than tricking journalists, the goal was to lure them with a completely typical press release about a research paper. (Of course, what’s missing is the number of subjects and the minuscule weight differences between the groups.) But a good press release isn’t enough. Reporters are also hungry for “art,” something pretty to show their readers. So Onneken and Löbl shot some promotional video clips and commissioned freelance artists to write an acoustic ballad and even a rap about chocolate and weight loss. (It turns out you can hire people on the internet to do nearly anything.)
Onneken wrote a German press release and reached out directly to German media outlets. The promise of an “exclusive” story is very tempting, even if it’s fake. Then he blasted the German press release out on a wire service based in Austria, and the English one went out on NewsWire. There was no quality control. That was left to the reporters.
And it didn't take the reporters long to pick up this crap-on-a-stick and run with it like children after the ice cream truck. Not all of the stories are still up, but some are. The Daily Star covered their paper, for instance, as did the Times of India, international editions of The Huffington Post, and some television news programs. Men's Health was going to go with a story in September, though that probably won't run now. Shape Magazine didn't get off so lucky, with their story appearing in the June issue, in print. And remember, this is all bullshit. None of it is real. How does something like this happen?
The answer is lazy "journalists."
When reporters contacted me at all, they asked perfunctory questions. “Why do you think chocolate accelerates weight loss? Do you have any advice for our readers?” Almost no one asked how many subjects we tested, and no one reported that number. Not a single reporter seems to have contacted an outside researcher. None are quoted. These publications, though many command large audiences, are not exactly paragons of journalistic virtue. So it’s not surprising that they would simply grab a bit of digital chum for the headline, harvest the pageviews, and move on. But even the supposedly rigorous outlets that picked the study up failed to spot the holes.
Now, there is some humor in all of this, but also danger. It's one thing to claim that chocolate leads to weight loss and have the media run wild with it, but we all know that fad diets and exciting health claims rain down on us in buckets, and I think it's safe to say that not all of them are as harmless as Bohannon's. The average person hasn't done much thinking about the validity of the studies they read about in the media; they simply trust the media to do the fact-checking. The media, it appears, largely trusts the journals to do the reviews and fact-checking. Except some (many?) of those journals don't. The whole thing harkens back to one of the funnier moments in the Anchorman movie, when the main character makes a ludicrous claim about women's brains being smaller than men's, and then punctuates the statement with a smirk, saying, "It's science." As far as much of the media reporting goes, it might as well be "science."
Strangely, Bohannon notes that readers of the articles were apparently more skeptical than the reporters who wrote them.
There was one glint of hope in this tragicomedy. While the reporters just regurgitated our “findings,” many readers were thoughtful and skeptical. In the online comments, they posed questions that the reporters should have asked.
“Why are calories not counted on any of the individuals?” asked a reader on a bodybuilding forum. “The domain [for the Institute of Diet and Health web site] was registered at the beginning of March, and dozens of blogs and news magazines (see Google) spread this study without knowing what or who stands behind it,” said a reader beneath the story in Focus, one of Germany’s leading online magazines. Or as one prescient reader of the 4 April story in the Daily Express put it, “Every day is April Fool’s in nutrition.”
If we've reached a time when readers are more skeptical than the reporters, that's a massive problem for journalism, but perhaps a delightful sign for the spread of skepticism and inquiry amongst the public. Either way, look with a critical eye the next time you hear about that fad diet or health food claim.
Filed Under: chocolate, fooling journalists, hoax, journalism, science, statistics, studies, weight loss