Scientific research and dealing with sensationalism
In early 2017, the Internet was abuzz with headlines boldly stating that a recent study showed that "Nutella causes cancer." These were immediately followed by contrarian headlines: "Don't worry, Nutella does not cause cancer."
So what was the truth? Neither.
The study had no direct link to Nutella. The flow of logic was thus:
- Eating too much GE (glycidyl esters) is not healthy
- Refined palm oil has more GE than other oils
- Nutella contains palm oil (but processed at a low temperature, which minimizes GE)
- Therefore, Nutella causes cancer
What the actual study had done was investigate three compounds. One of them was glycidyl esters (GE), which palm oil contains more of than other oils. Furthermore, the study looked at animals, not humans. On top of that, it examined GE found in refined palm oil. And lastly, it looked at GE consumed at very high rates (rare for adults).
If you want, you can read our full analysis, "Does Nutella cause cancer?"
This is not a thought exercise - this actually happened. The media made some incredible leaps of logic to get to "Nutella causes cancer."
This is why we always read the full study. It's popular for gurus to read a quick PubMed abstract and decide they know what a study shows. But in our previous example, the abstract did not mention that no human data was used! It was only by reading the entire 159-page paper that we discovered this.
We do not outsource any of our analyses. This is why we have a full team with diverse backgrounds, to ensure that we get the full picture when we analyze any study.
Furthermore, this is why we consistently position ourselves as Switzerland. It is not our job to debunk any specific claims, and we do not target any specific writers or organizations. This is also why we do not call ourselves media - we are not pushing news onto our readers.
Our job is simply to look at the full body of research, based on what our readers are asking us, and give them a complete assessment.
There are a few ways you can be better at dealing with clickbait headlines:
1. “Randomized trial” does not mean “infallible truth”
Randomized controlled trials (RCTs) may be the “gold standard” of study design, but many are riddled with methodological issues (such as small sample sizes or dubious statistical methods) and bias, which can stem from funding sources or the authors themselves. For example, studies funded by soda manufacturers are more likely to show neutral results for sugary or diet drinks.
Therefore, RCTs are a key ingredient, but not enough to establish certainty by themselves.
2. Studies do not exist in a vacuum
The scientific method relies on collecting many observations and testing your hypothesis or hypotheses. So citing just one study, without fully exploring the existing literature on the topic, will invariably get you into trouble.
If one study finds X=A, and 13 studies find that X=B, it’s very likely that X=B.
RCTs are important pieces of the puzzle, but you have to put them all together to see the full picture - not just look at one in isolation.
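As a toy illustration of "putting the studies together": meta-analyses often combine results by weighting each study by its precision, so one noisy outlier cannot dominate thirteen consistent findings. The numbers below are entirely hypothetical, and this is a minimal sketch of inverse-variance pooling, not how any specific meta-analysis was run.

```python
# Hypothetical example: one outlier study vs. 13 consistent studies.
# Each tuple is (effect_estimate, standard_error) from one study.
studies = [(0.9, 0.4)]            # a single study suggesting a large effect
studies += [(0.1, 0.2)] * 13      # 13 studies suggesting a near-zero effect

# Inverse-variance weights: more precise studies count for more.
weights = [1 / se**2 for _, se in studies]
pooled = sum(w * e for w, (e, _) in zip(weights, studies)) / sum(weights)

# The pooled estimate lands near the 13 consistent studies (~0.12),
# far from the lone outlier (0.9).
print(round(pooled, 2))
```

This is the basic logic behind "X=B is very likely": the weight of consistent, reasonably precise evidence pulls the overall estimate toward itself.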
3. You do not need a study for everything
The body is infinitely more complex than we understand, and isolated trials will never be able to answer all the questions we want answered. Your personal experience is often worth as much as, and sometimes more than, the studies that are out there.
The placebo effect is real and also cannot be ignored.
At the end of the day, your health depends on you. We take an evidence-based approach that sums up the general truth, and we always ensure that we look at the full body of research.