Intermittent Fasting and the Perils of Fast Facts

Here's why we should be wary of recent headlines linking the popular dietary strategy with an increased risk of cardiovascular death.

This article was first published in The Montreal Gazette.

Recent headlines proclaimed that intermittent fasting can double your chances of dying from a heart attack. It isn't true, though. Understanding why forces us to examine a fundamental feature of statistical randomness.

We also need to remember another important medical truth: Your zodiac sign doesn’t affect the effectiveness of Aspirin.

If you missed the story, a report from the recent American Heart Association Epidemiology and Prevention / Lifestyle conference suggested that intermittent fasting, otherwise known as time-restricted eating, was associated with an increase in cardiovascular mortality. The story dominated the news cycle for about two days before the pushback began.

In brief, researchers analyzed two separate data sets to study the long-term implications of this increasingly popular dietary strategy. Information about eating patterns was collected from the National Health and Nutrition Examination Survey. As part of the survey, participants completed two separate food questionnaires about what they ate over the previous 24 hours. Deaths were recorded in the Centers for Disease Control and Prevention's National Death Index database, and researchers had on average eight years' worth of follow-up data on just over 20,000 participants.

Researchers examined whether time-restricted eating affected mortality risk. Those who ate their daily meals within an eight-hour window (that is, they fasted for more than 16 hours a day) had a higher risk than the control group, which ate during a 12- to 16-hour window (that is, they ate fairly consistently throughout the day). Comparing these two groups, the 16:8 dieters had a 91 per cent increased risk of cardiovascular death.

But here the problems started. Much of the early reporting was based on a press release put out by the American Heart Association. There were minor discrepancies between the submitted abstract and the actual poster presentation that was eventually put online.

For those unfamiliar with the mechanics of medical research, poster presentations at conferences are deliberately brief snapshots of one's research. They are more akin to a movie trailer than to the final theatrical release. Since the data was not published in a medical journal, details were sparse, and one should expect more edits and revisions before we see the final version.

Despite these surface-level objections to the preliminary and non-peer reviewed nature of the results, there is a more fundamental problem with the research. Although the 16:8 dieters did show a 91 per cent increased risk of cardiovascular death, the poster actually contained a multitude of analyses. Notably, there was no association between intermittent fasting and cancer mortality or with mortality overall.

Researchers also divided participants into several groups, and it was only those doing more than a 16-hour fast who had the increased risk. Of the 36 analyses in the poster, only one showed an increased risk. Paradoxically, those who did not fast at all (that is, they ate continuously for more than 16 hours a day and stopped seemingly only to sleep) had a lower risk of cancer mortality.
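The arithmetic behind this objection is worth seeing. If all 36 analyses were testing effects that do not actually exist, and each used the conventional 5 per cent significance threshold, the chance of at least one "significant" result appearing purely by chance is very high. A minimal sketch (the independence of the tests is an assumption made for illustration; the poster's analyses were likely correlated, which changes the exact number but not the point):

```python
import random

random.seed(42)

ALPHA = 0.05        # conventional significance threshold
N_TESTS = 36        # number of analyses in the poster
N_SIMULATIONS = 10_000

# Analytic expectation: if all 36 null hypotheses are true and the
# tests are independent, the probability of at least one false
# positive is 1 minus the probability that all 36 come up negative.
p_any = 1 - (1 - ALPHA) ** N_TESTS
print(f"P(at least one false positive): {p_any:.2f}")

# Simulation of the same idea: under the null hypothesis, p-values
# are uniform on [0, 1]. Draw 36 of them and count how often at
# least one dips below the threshold.
hits = sum(
    any(random.random() < ALPHA for _ in range(N_TESTS))
    for _ in range(N_SIMULATIONS)
)
print(f"Simulated frequency: {hits / N_SIMULATIONS:.2f}")
```

Both numbers come out around 0.84: with 36 tests, one or two spurious "findings" are closer to the expected outcome than to a surprise.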

When examined objectively, the analyses in this poster showed no link between intermittent fasting and any of the outcomes. There were two outliers, one suggesting higher risk and another showing a lower risk for different outcomes in different groups. My interpretation is that this is random statistical noise.

A famous 1988 study, the ISIS-2 trial, demonstrated that taking Aspirin during a heart attack provided a cardiovascular benefit. But there was one group of patients who did not benefit: Geminis and Libras. The researchers included this spurious if humorous analysis to make a point. Perform enough statistical tests and you will eventually get a positive but completely random and meaningless result.

Many critiques can be levelled against this conference poster. It is not peer reviewed, did not adjust for the quality of participants’ diet, and relied on only two days of self-reported diet history. But more importantly, the lone positive association is not really groundbreaking. It’s likely an outlier. In research, as in life, some things are just random.

