Last year, we featured a story on our blog about the so-called cardiovascular benefits of fish oil, claims largely based on a seminal research study that had more to do with hearsay than with actual science. Having taken aim at your diet, flawed research is now meddling with your sports life.
A Danish study published in the Journal of the American College of Cardiology recently made headlines for suggesting that too much jogging could shorten life expectancy. In a recent New York Times post, economist Justin Wolfers carefully analyzes the study and provides a brilliant response discrediting this overconfident claim:
> The researchers asked Danish runners about the speed, frequency and duration of their workouts, categorizing 878 of them as light, moderate or strenuous joggers. Ten years later, the researchers checked government records to see how many of them had died […] Happily, only 17 had. While this was good news for the surviving runners, it was bad news for the researchers, because 17 was clearly too few deaths to discern whether the risk of death was related to running intensity.
>
> Nonetheless, the study claimed that too much jogging was associated with a higher mortality rate […] The evidentiary basis for this claim is weak. It is based on 40 people who were categorized as “strenuous joggers” — among whom only two died. That’s right: The conclusions that received so much attention were based on a grand total of two deaths among strenuous joggers. As Alex Hutchinson of Runner’s World wrote, “Thank goodness a third person didn’t die, or public health authorities would be banning jogging.”
>
> Because the sample size was so small, this difference is not statistically significant. You may have heard the related phrase “absence of evidence does not equal evidence of absence,” and it is particularly relevant here […] In fact, the main thing the study shows is that small samples yield unreliable estimates that cannot be reliably discerned from the effects of chance.
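To see concretely how little can be inferred from two deaths among 40 people, here is a minimal Python sketch using the figures quoted above (878 joggers, 17 deaths overall, 40 strenuous joggers with 2 deaths). Pooling the light and moderate joggers into one comparison group is a simplification made only for illustration, not the study's actual design, and the calculations rely on scipy.

```python
# Minimal illustration of why 2 deaths among 40 strenuous joggers cannot be
# reliably distinguished from chance. Figures come from the excerpt above;
# pooling the lighter categories is a simplification for illustration only.
from scipy.stats import beta, fisher_exact

strenuous = [2, 38]    # deaths, survivors among the 40 strenuous joggers
lighter = [15, 823]    # deaths, survivors among the remaining 838 joggers

# Fisher's exact test: is the difference in death rates (5% vs. ~1.8%)
# distinguishable from random variation with samples this small?
odds_ratio, p_value = fisher_exact([strenuous, lighter])
print(f"odds ratio = {odds_ratio:.2f}, p-value = {p_value:.3f}")

# Exact (Clopper-Pearson) 95% confidence interval for the strenuous joggers'
# death rate: with only 2 events in 40 people, it spans a huge range.
deaths, n = 2, 40
lower = beta.ppf(0.025, deaths, n - deaths + 1)
upper = beta.ppf(0.975, deaths + 1, n - deaths)
print(f"95% CI for the strenuous-jogger death rate: {lower:.1%} to {upper:.1%}")
```

With a confidence interval that wide, the data are consistent with strenuous jogging being anywhere from essentially harmless to quite risky, which is exactly the "absence of evidence" problem Wolfers describes.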
Wolfers goes on to highlight other weaknesses in the Danish study. This latest case of an unreliable research finding receiving wide media coverage brings out several points that are central to our work at BITSS:
- Over-interpretation of inconclusive data has become common practice, a way to make results look better than they actually are.
- Publication alone is no guarantee that a study's findings can be taken at face value. Repeated cases of publication bias and weaknesses in peer review have cast doubt on the credibility of published findings.
- Sloppy research making groundbreaking claims too often gets picked up by popular media without any due diligence or careful interpretation. Of course, this wouldn't happen if reviewers didn't let such studies through in the first place.
- It's essential to look for potentially contradictory sources of evidence before making any scientific claim. In this regard, researchers should be given stronger incentives to conduct meta-analyses and replications of existing studies.
As Wolfers concludes, “scientific progress is slower and less spectacular than it often appears.”