I’ll admit this is predictable, but I can’t resist: It’s a tale of two news releases. Released on the same day, about the same study, but with very different headlines.

But first, the study: a randomized, placebo-controlled study run over four years by Creighton University, in collaboration with the University of California San Diego, with the objective of determining if dietary supplementation with vitamin D and calcium reduces the risk of cancer in older women. So what’s the answer?

Judging from the Creighton news release, vitamin D and calcium supplementation DID reduce the risk of cancer.

But according to the news release from the Journal of the American Medical Association (JAMA) — where the study was published this week — it DID NOT.

The Creighton news release highlights that among 2,303 healthy post-menopausal women over the age of 55, those who were randomly assigned to receive 2,000 international units (IU) of vitamin D3 and 1,500 mg of calcium had a 30 percent lower risk of developing cancer than the placebo group. But buried in the third paragraph is this:

“This difference in cancer incidence rates between groups did not quite reach statistical significance.”

So why the “decreases risk of cancer” headline?

Clearly, these results are speculative until borne out by other large, controlled studies like the pending VITAL study at Brigham and Women’s Hospital.

Fishing for significance?

The authors go on to point out that two post hoc analyses were significant. However, post hoc analyses — by definition — look at the data AFTER the experiment is concluded. They simply look for patterns, after the fact, that were not part of the original study design. That’s why some critics call them “fishing expeditions.” They are considered inconclusive in studies like this one, in which the research question is clearly defined beforehand.
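To see why unplanned, after-the-fact comparisons invite false positives, it helps to look at the arithmetic of multiple testing: run enough independent comparisons at the conventional 0.05 threshold and the chance of at least one spurious “significant” result climbs quickly. A minimal sketch — the numbers of tests here are illustrative, not taken from this study:

```python
# Probability of at least one false positive when running k independent
# statistical tests, each at significance level alpha, when no true
# effect exists. This is the standard familywise error rate formula.
def familywise_error(k, alpha=0.05):
    return 1 - (1 - alpha) ** k

# Illustrative values of k only; not the number of analyses in the study.
for k in (1, 5, 10, 20):
    print(k, round(familywise_error(k), 3))
```

With 10 unplanned comparisons, the odds of at least one chance “finding” are already about 40 percent — which is exactly why post hoc results are treated as hypothesis-generating rather than conclusive.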

Even the authors, in their discussion of the study, point out that the post hoc observations “should be considered only exploratory and hypothesis-generating, and require assessment in further studies.”

The first post hoc analysis showed an inverse association between blood levels of vitamin D (specifically, 25-hydroxyvitamin D) and cancer in the supplement group. The problem is that this observational comparison is no longer truly randomized, so drawing cause-and-effect conclusions is impossible.

The second post hoc analysis — which excluded cancers, deaths, and dropouts during the first year — showed that 3.17 percent of the supplement group and 4.86 percent of the placebo group had a new cancer diagnosis during years 2-4 of the study. But this result cannot unequivocally differentiate which agent, calcium or vitamin D, is responsible for that effect.
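For readers curious what “significant” means for those year-2-to-4 percentages, the standard tool is a two-proportion z-test. The counts below are back-calculated from the reported rates assuming roughly equal arms of about 1,150 women each — the paper’s actual denominators may differ, so treat this as a sketch of the method, not a re-analysis:

```python
import math

def two_proportion_z(x1, n1, x2, n2):
    """Pooled two-proportion z-test; returns (z statistic, two-sided p)."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    # Two-sided p-value under the normal approximation
    p_value = math.erfc(abs(z) / math.sqrt(2))
    return z, p_value

# Hypothetical counts chosen to match the reported 3.17% vs 4.86%
# (36/1150 and 56/1150); the study's real per-arm numbers may differ.
z, p = two_proportion_z(36, 1150, 56, 1150)
print(round(z, 2), round(p, 3))
```

Under these assumed counts the difference clears the 0.05 bar — consistent with the authors calling the post hoc comparison significant — but as noted above, clearing that bar in an unplanned comparison is far weaker evidence than clearing it on the prespecified primary outcome.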

But back to the news releases because there are two important considerations to bring up here.

First, we don’t know who was responsible for the glowing headline coming from the Creighton media affairs department. Whether public relations officials or study authors pushed this misleading narrative is unclear. But co-author Cedric Garland certainly should be challenged on this quote:

“This is the most important scientific study of this century to date.”

Really? A study with no statistical significance? Is it possible that Dr. Garland’s grandiose claim is influenced by the fact that he, along with half of the authors of this study, is associated with GrassrootsHealth, a nonprofit whose primary focus seems to be promoting the therapeutic or protective value of vitamin D for a variety of conditions, such as diabetes, Alzheimer’s disease, autism, cystic fibrosis, and premature births, to name a few?

Second, news releases like the arguably self-serving one from Creighton are not benign. In a time crunch, which news release does a journalist choose? Maybe KMTV in Omaha went with the hometown Creighton release, as suggested by their headline, “Creighton study finds vitamin D decreases risk of cancer” (although, contrary to the headline, KMTV eventually mentions the study was not statistically significant). Other news outlets, like CBS News and WebMD, were more critical and comprehensive.

What are the lessons here?

Lesson number one is that low levels of skepticism may be hazardous to your health. As Gary Schwitzer wrote in this recent BMJ article about what is or isn’t fake news, it’s probably good old-fashioned “spin” that is actually more ubiquitous, more manipulative, and maybe even more dangerous.

Lesson two is a variation on “measure twice, cut once,” which, when applied to reading the news, means going to two or more sources on stories you feel are important before rendering an opinion. This is a classic example of a story where reading just one source could have left you with exactly the wrong information.

The final lesson is a big one: special interests are everywhere. Readers might want to ask themselves these questions. Is it possible Creighton put a positive spin on these inconclusive results? If so, to whose benefit? Also, is it possible the authors of this study may have gone looking for an answer that suited their agenda? And when the results were inconclusive — which is important information — why not just share that openly?