You probably already knew this, but nutrition science is complicated. It doesn’t lend itself well to three-paragraph clickbait articles. That doesn’t stop headlines like these, though:
“Coconut oil found to kill cancer”
“Cucumbers relieve pain, study shows”
But how can you identify clickbait when it’s more subtle than these silly headlines? Like the misleading articles that emerge whenever a hot new study comes out? These three tips can help:
When some people discover the magic of PubMed, they start wielding PubMed links like weapons.
Unfortunately, that’s not the way that rational argumentation works. A study is just a potential source of data, not truth in and of itself. So a link to a study is only a link to potentially relevant knowledge. Or in some cases, potentially irrelevant or misleading knowledge.
For example, studies funded by soda manufacturers are much more likely to show neutral results for sugary or diet drinks. Studies funded by nonprofit entities are more likely to find soda harms. I highly doubt that the non-profit entities have a secret agenda to destroy any for-profit entity. Using Occam’s razor, the simplest explanation is that the company-funded studies have been designed or reported in such a way as to emphasize the good while de-emphasizing (or even sometimes ignoring!) the bad.
Randomized trials may be the “gold standard” of evidence, but many are riddled with methodological issues (like small sample sizes, or wacky statistical methods), and bias can exist from funding sources or the authors themselves.
Imagine that you’re a researcher who published a really cool study showing that vitamin D lowers blood pressure. Then a couple of years later, a study comes out that finds the opposite of what your study found. Their study is newer, and in the eyes of the media and public, it has at least temporarily relegated your study to the dustbin.
As a nutrition enthusiast, you have to understand that reading the full text of one single study isn’t enough. You also have to know about other relevant studies, and know how to compare the details of these studies (biostatistics, patient characteristics, etc.). Some seemingly contradictory studies may actually jibe once you account for different study methods and samples, but some may not.
The scientific method relies on collecting many observations and iterating your hypothesis. So citing just one study, without knowing about other relevant ones, will invariably get you into trouble.
This cannot be repeated enough. Let’s say that you have a friend who SUPPOSEDLY developed an allergy to red meat a few years ago. But instead of the reaction happening within minutes of eating the food, like a normal allergy, it happened hours afterward. Ridiculous! Preposterous! And there weren’t any studies on this supposed allergy when you looked it up … so it must be psychosomatic.
Wrong. In the past few years, studies have started documenting red meat allergies caused by tick bites.
Humans are so confident in their limited cognitive abilities that they’ll disregard anything that doesn’t fit into their predefined mold. If someone reliably gets an allergic reaction after eating red meat, that’s a very valuable observation, and the starting point for further inquiry. It doesn’t have to be backed up by a study, because studies aren’t always available for what’s happening to you.
The body is infinitely more complex than we understand, and isolated trials will never be able to answer all the questions we want answered. Your personal experiences are worth as much as, and sometimes more than, the studies that are out there.
Clickbait in nutrition is extremely common. Even some major media outlets release clickbait articles on the regular, as readers’ attention spans go down and people hunt for quick health fixes in their news sources. Don’t fall for this. Learn about actual research from unbiased sources, with all the nuances that are involved. Details and complexities are hard, but worthwhile.