But of course, that’s probably not true.

A new article published in the American Journal of Clinical Nutrition, by JD Schoenfeld and JP Ioannidis, examined the conclusions, statistical significance, and reproducibility of published articles that claim an association between specific foods and the risk of cancer. They selected 50 common food ingredients, taken from random recipes found in a typical cookbook. They then searched PubMed for studies that examined the relationship of each ingredient with a risk of cancer. (If they found more than 10 articles for a particular ingredient, they evaluated only the 10 most recent.) The study didn't just examine increased risks but also potential reduced risks of cancer.

According to Schoenfeld and Ioannidis, 40 of the 50 ingredients had articles describing a relationship with cancer, spread across 264 single-study assessments. Among the 40 foods that had been linked to cancer risks were flour, coffee, butter, olives, sugar, bread and salt, as well as peas, duck, tomatoes, lemon, onion, celery, carrot, parsley and lamb, together with more unusual ingredients, including lobster, tripe, veal, mace, cinnamon and mustard.

Tripe? No thanks.

Here are some of their findings:

Of the 264 articles, 73 showed no cancer risk, 103 concluded that there was an increased risk, and 88 a decreased risk.

About three quarters of the articles that reported some change in cancer risk had weak or no statistical significance behind that change. In other words, the results were barely different from what random chance alone would produce!
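To see what "weak or no statistical significance" means in practice, here is a minimal sketch of the standard check, with made-up numbers rather than figures from the study: a relative risk whose 95% confidence interval includes 1.0 (no effect) is statistically indistinguishable from no effect at all.

```python
import math

def significant_at_95(rr, se_log_rr):
    """Return True if the 95% confidence interval of a relative risk
    excludes 1.0, i.e. the effect is distinguishable from 'no effect'.

    rr         -- reported relative risk (e.g. 2.0 means doubled risk)
    se_log_rr  -- standard error of the log relative risk
    """
    lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
    hi = math.exp(math.log(rr) + 1.96 * se_log_rr)
    return not (lo <= 1.0 <= hi)

# A headline-grabbing doubled risk, but measured very imprecisely:
big_but_noisy = significant_at_95(2.0, se_log_rr=0.5)    # False

# A modest effect, measured precisely:
small_but_precise = significant_at_95(1.2, se_log_rr=0.05)  # True
```

Note that it is the precision of the measurement, not the size of the headline number, that determines significance: the "doubled risk" above fails the test while the modest effect passes.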

Only 26% of meta-analyses reported any change in cancer risk. (A meta-analysis is a statistical method that contrasts and combines results from different studies, in the hope of identifying consistent patterns across them.) So among analyses that pooled the studies on a particular food, only about one quarter could find any effect at all.
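The pooling idea behind a meta-analysis can be sketched in a few lines. This is a generic fixed-effect, inverse-variance model with invented numbers, not the authors' actual method or data:

```python
import math

def pooled_relative_risk(studies):
    """Fixed-effect, inverse-variance pooling of log relative risks.

    Each entry is (log relative risk, standard error) from one study.
    More precise studies (smaller standard errors) get more weight.
    """
    weights = [1.0 / se ** 2 for _, se in studies]
    pooled_log_rr = (
        sum(w * lrr for (lrr, _), w in zip(studies, weights)) / sum(weights)
    )
    pooled_se = math.sqrt(1.0 / sum(weights))
    return math.exp(pooled_log_rr), pooled_se

# Three hypothetical studies: two small ones report large effects with
# wide error bars; one larger, precise study finds almost nothing.
studies = [
    (math.log(2.0), 0.5),
    (math.log(1.8), 0.4),
    (math.log(1.05), 0.1),
]
rr, se = pooled_relative_risk(studies)
```

Because the precise study dominates the weighting, the pooled relative risk lands much closer to 1.05 than to 2.0, which is exactly the shrinkage of effect sizes in meta-analyses that the authors describe.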

So what did they conclude?

Associations with cancer risk or benefits have been claimed for most food ingredients. Many single studies highlight implausibly large effects, even though evidence is weak. Effect sizes shrink in meta-analyses.

In an article in The Observer, Schoenfeld stated that “we found that, if we took one individual study that finds a link with cancer, it was very often difficult to repeat that in other studies. People need to know whether a study linking a food to cancer risk is backed up before jumping to conclusions.”

I wrote an article a few months ago debunking a link between bananas and preventing cancer, because the study in question didn't actually claim anything about bananas preventing cancer. But the study spread across the internet as "definitive scientific proof."

Basically, whenever you see a claim that some food causes or prevents cancer, treat that information with a huge amount of skepticism. It's probably not supported by solid evidence.
