The other day a friend of mine posted a link to a peer-reviewed scientific study concerning the effects of a vegetarian diet, along with an excerpt from the paper’s abstract:
Our results revealed that a vegetarian diet is related to a lower BMI and less frequent alcohol consumption. Moreover, our results showed that a vegetarian diet is associated with poorer health (higher incidences of cancer, allergies, and mental health disorders), a higher need for health care, and poorer quality of life.
Before I even clicked the link, alarm bells were going off. Just in those two sentences they list seven different things measured. That’s not science, kids, that’s shooting dice in the alley. If you measure enough things about any group of people, you’ll find something that looks interesting. Holy moly, I thought, how many things did this survey try to measure, anyway? (I believe the answer to that is eighteen.)
It’s possible that some of the correlations these guys found are real, and not the result of random chance. But there’s no way to tell which ones from this study alone, because when you test that many things at once, it’s almost certain that some of the conclusions are completely bogus.
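For the curious, here’s what that dice-rolling looks like in practice. This is a minimal sketch of the multiple-comparisons problem, not a re-analysis of the paper: it compares two groups drawn from the very same population on eighteen unrelated measures and counts how many “significant” differences pop up anyway. The eighteen comes from the survey; the group size, the normal distributions, and the 0.05 cutoff are illustrative assumptions I made up for the sketch.

```python
# Toy simulation of the multiple-comparisons problem.
# Two groups drawn from the SAME population are compared on 18 unrelated
# measures; at a 0.05 threshold we still expect about one "significant"
# hit (18 * 0.05 = 0.9) purely by luck.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n_measures = 18      # number of outcomes the survey checked
group_size = 330     # arbitrary group size, chosen for the sketch

false_positives = 0
for i in range(n_measures):
    # Both "diets" sampled from the identical distribution: no real effect.
    vegetarians = rng.normal(loc=0.0, scale=1.0, size=group_size)
    omnivores = rng.normal(loc=0.0, scale=1.0, size=group_size)
    _, p_value = stats.ttest_ind(vegetarians, omnivores)
    if p_value < 0.05:
        false_positives += 1
        print(f"measure {i + 1}: 'significant' difference, p = {p_value:.3f}")

print(f"{false_positives} spurious finding(s) out of {n_measures} measures")
```

Roughly speaking, if the eighteen tests were independent, the chance of at least one spurious hit at the 0.05 level is about 1 - 0.95^18, or around 60 percent, so “finding something” is the expected outcome even when there is nothing to find.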
And then there’s selection bias. I read elsewhere (link later) that in Austria, many vegetarians eat that way on doctor’s orders, because they’re already sick. If illness is what pushes people toward a vegetarian diet, the diet will look like it causes the illness, and that will skew the numbers.
But the paper was peer-reviewed, right? I spent a little time trying to figure out who those peers might be, but I could find no sign of them on the site where this paper is self-published. And, frankly, “peer-reviewed” doesn’t mean shit anymore. Peers are for sale all over the place. If you can’t see the credentials of the people who reviewed the work, it may as well not be peer-reviewed at all.
And none of the authors list any credentials or degrees themselves. Perhaps they just didn’t feel compelled to mention them, but that strikes me as odd, especially for Europeans, who traditionally love to lay on the titles and highfalutin name decorations.
The site lists 53 mentions of the article in the media. Some of the outlets that quote this nonsense actually have “science” in their names. Sigh. Apparently Science 2.0 is science where you believe every press release that crosses your desk. Perhaps Muddled Ramblings and Half-Baked Ideas will make number 54, although I suspect the keepers of PLOS ONE might not want this particular mention promoted. But to their credit they do link to an article in that Bastion of Science, Outside Online, where at least one journalist took a sniff before pressing the “publish” button.
Outside Online, you do science better than Science 2.0. You have my admiration.
So is this research totally useless? Actually, no. It’s possible a grad student somewhere could find ONE of the claims made in the paper interesting enough to do REAL science that improves our understanding of nutrition and health. That follow-up study would test a single hypothesis, something like “a vegetarian diet increases the chances of lymphoma,” while keeping the rest of the variables as controlled as possible in a human study (which is really tough).
That work would take years to accomplish and would not show up in The Guardian or probably even Outside Online. It would be a small brick in our edifice of understanding, a structure that has been growing for hundreds of years.
So when you read about “a study” that shows many things, look at it with squinty eyes and you’ll see behind it a group of people rolling the dice, and there’s often no telling who their master is. It’s not really a study at all, but a press release with numbers.
Yes, we know people who have switched to a vegetarian diet BECAUSE they had cancer; that’s exactly the selection bias I mentioned above. However, if the correlation is true, the rest of us will live long and cancer-free lives!