In 2015, gastroenterologist Edwin Liu set to work on a clinical and genetic data set that had been growing for more than 20 years. The data pertained to celiac disease, a lifelong condition involving bouts of severe gastrointestinal distress and other symptoms, triggered by ingestion of gluten proteins that are found in wheat and several other grains. In a two-decade collaboration with researchers at Children’s Hospital Colorado in Denver, Liu’s predecessors and colleagues at the University of Colorado kept track of 1,339 babies born in the city who were deemed at risk of developing the disease due to mutations in celiac-linked genes. The researchers carried out yearly tests to see whether or not the children developed the disease, hoping to better define the risk associated with each of the genetic variants.

Researchers reading the paper, which was published online earlier this year in Gastroenterology,1 were taken aback. “If you look at the rates, it’s frightening,” says Joseph Murray, a celiac researcher at the Mayo Clinic in Rochester, Minnesota. Of course, the statistic could be specific to the Denver cohort, he notes, but it does fit with similar trends reported both in the U.S. and around the world.

Celiac symptoms, which include abdominal pain and distension, diarrhea and flatulence, nausea, and fatigue, are brought on by ingestion of gluten—a protein complex present in wheat, barley, and rye. Unlike food allergies, which are primarily mediated by overreactive adaptive immune responses such as immunoglobulin E antibody production and mast cell activation, celiac disease engages both innate and adaptive immune pathways, and produces antibodies that target not only gluten but also the body’s own proteins. As a result, the disease is generally considered an autoimmune condition. (See illustration.) Triggered by even tiny amounts of gluten, these immunological attacks lead to T cell–mediated atrophy of the gut wall, which can be characterized via a biopsy of the small intestine for celiac diagnosis (see “Diagnosing Celiac Disease”).

As biopsy and other diagnostic methods have improved in recent decades, celiac disease has become easier to detect. So when the first reports of increasing numbers of celiac cases in the U.S. came out in the early 2000s, many researchers attributed the uptick to progress in disease recognition. But closer scrutiny of the data suggested there was more going on. “We weren’t just better at finding celiac disease,” Murray says. “There was a lot more of it to go around.”

Around 40 percent of people have the genes predisposing them to celiac disease. The big question is why some people get it and others don’t.—Edwin Liu, University of Colorado

By comparing blood samples taken from young adults in the Air Force around 1950 with matched samples from residents of a Minnesota county collected since 1995, for example, Murray’s group estimated an increase in prevalence from 0.2 percent to nearly 1 percent.2 Sweden, meanwhile, experienced what is now referred to as a celiac epidemic in the late 1980s and early 1990s, with one study estimating that as many as 3 percent of children born at the height of the epidemic had developed celiac disease by the age of 12—though rates dropped back down to just over 2 percent for children born in 1997.3 And several studies based on blood tests suggest increasing numbers of people are developing celiac disease in wheat-eating areas of northern India, with a prevalence in children of around 1 percent and some researchers warning of an impending epidemic there too.

The cause of this apparently global trend remains a mystery, not least because, while the immunopathology of celiac disease has been studied for decades, just what causes people to develop the ailment in the first place remains unclear. Almost all diagnosed patients have mutations in at least one of the two genes coding for HLA-DQ, a membrane receptor on antigen-presenting cells that helps the immune system distinguish self from non-self and coordinate T-cell activity. But not everyone who has such risk genes gets celiac. “Around 40 percent of people have the genes predisposing them to celiac disease,” Liu explains. “The big question is why some people get it and others don’t.” Hypotheses abound, with many pointing the finger at a gluten-rich diet, but evidence to support these ideas remains far from conclusive.

Getting to the bottom of this question will be necessary not only to curb the concerning trend, but also to help doctors better detect and manage the multifarious disease, for which the only current treatment is a gluten-free diet. In addition to celiac’s sometimes-debilitating symptoms, the disease is associated with a heightened risk for numerous conditions, including autoimmune diseases such as diabetes and hypothyroidism, and myriad other disorders, from infertility to small-bowel cancer. Overall, celiac patients have up to a twofold increased mortality risk compared with the general population.

“The stakes are high,” says Murray. “If this disease has gone from being a truly rare disease in some geographies to being a common disease affecting 2 or 3 percent of children, that’s no longer a small disorder.”

Why the rise?

© ISTOCK.COM/MACROVECTOR

One thing celiac researchers agree on is that the direct cause of the rise in the disease likely resides outside of our DNA. “Over decades, it’s just too quick for genetic changes to occur,” Liu says. “We have to assume that this is based on environmental factors.” There’s still little in the way of concrete answers as to what these factors might be, however. “I’ve heard every type of hypothesis that’s been thrown out there,” says Murray, “but most of them are not easily testable.”

Some of the more unusual candidates blamed for triggering celiac disease include microwaves, plasticware, and diatomaceous earth—an abrasive powder applied to flour containers as an insecticide—although scientific evidence to incriminate these supposed culprits is scant. Other factors that have more reliably been tied to increased celiac risk in genetically predisposed infants include delivery by Caesarean section and intestinal infections by pathogens such as reovirus (recently implicated in a mouse model)4—although their impacts are likely minor, Murray says.

The role of gluten itself—the immediate trigger for the immune responses in celiac patients and therefore, researchers have long assumed, a crucial player in the epidemiology of the disease—has also remained frustratingly elusive. During the 2000s, for example, several observational studies pointed to a suite of dietary factors, including age of gluten introduction, as influencing the development of celiac disease. But the findings suffered a blow in 2014 when two randomized clinical trials failed to find any effect of the timing of gluten introduction.5,6 The studies also found no evidence for a link with the duration of breast-feeding—a factor that had previously been touted as protective against developing celiac disease. Gastroenterologist Alessio Fasano of Massachusetts General Hospital, a coauthor on one of the publications, says researchers realized then that “the story is much more complex than we thought.”

A more recent hypothesis is that the amount of gluten consumed, if not the timing, could play a role in triggering celiac disease in children. The US Food and Drug Administration (FDA) notes that wheat consumption increased rapidly in the latter half of the 20th century as people began to eat less meat and consume increasing amounts of readily available, wheat-containing fast foods. (A more controversial idea is that the composition of wheat has changed significantly during this time—search “Frankenwheat.”) And the rising incidence of celiac disease in South Asia tracks with the widening adoption of Westernized diets, although data on gluten consumption per se are lacking.

Some evidence that these dietary changes could be tied to the rise in celiac disease comes from a retrospective 2016 study of Swedish infants, which suggested that genetically susceptible children consuming more than 5 grams of gluten per day—the equivalent of about one slice of whole wheat bread—before 2 years of age were up to two times more likely to develop celiac disease than those consuming less than that amount.7 “[The result] tells me that the amount of gluten matters,” says Murray. “I think we have to go back and revisit what’s happening with gluten—how much are we eating, and is it a potential risk factor?”

Reactions to these findings have been mixed, however. “The evidence was fairly weak,” notes celiac researcher Detlef Schuppan of Johannes Gutenberg University Mainz in Germany. In terms of the global rise in celiac prevalence, “the amount of gluten ingested does not explain it,” he adds.

© JULIA MOORE

Amid uncertainty about gluten’s part in the celiac trend, many researchers are quick to point out that it’s not just our diet that has changed in the last century. One factor now under scrutiny in digestive diseases and beyond is the use of antibiotics and, consequently, the composition of the gut microbiome. Bacteria living in the gut play important roles in metabolism and in the regulation of immune responses to food, so for many researchers, these microbes are likely suspects in celiac disease pathogenesis. According to this line of thinking, “maybe the bugs we’ve now got are not as happy when they interact with gluten,” Murray says. “Or the results are not as good for us when these bugs interact with gluten as when our old bugs did.”

See “The Sum of Our Parts”

The last two decades have seen a number of observational studies report abnormal microbiome composition in the guts of celiac patients compared with healthy controls. Patients with celiac disease show a higher proportion of gram-negative bacteria such as Bacteroides and E. coli, for example, and some evidence suggests that those displaying gastrointestinal symptoms also have higher levels of Proteobacteria. Another bacterium, Helicobacter pylori, has been associated with protection from celiac disease, and declines in the number of adults carrying this microbe in their guts appear to have coincided with increases in the number of celiac cases in the U.S.—although research on this subject remains inconclusive.

Whether these differences in microbiome composition are the cause or the consequence of celiac disease remains unanswered, says Bana Jabri, director of research at the University of Chicago Celiac Disease Center. “You could think about it in several ways,” she explains. “Maybe there’s a difference in the microbiota from the beginning, and this has a causative role. Or maybe what you’re seeing is just a secondary effect. At this point, we really don’t know.”

Nevertheless, circumstantial evidence is accumulating to suggest more than a passive role for these microbes. For starters, it’s known that changes in the microbiota can induce different types of immune responses, and celiac patients often continue to show abnormal gut flora even after adopting a gluten-free diet. Additionally, Jabri’s group showed last year that mice engineered to overexpress interleukin-15—a cytokine involved in celiac disease pathogenesis—had restructured microbiota as well as altered production of certain fatty acids, mirroring precursors of intestinal inflammatory diseases in humans.8 “When you put all this together, you could say that there really is enough evidence to believe in a causative role for the microbiota,” says Jabri. “But the critical experiments still need to be done.”

With so many factors being investigated, Liu says, it’s unlikely the explanation for an increase in celiac disease incidence will be simple. “I don’t think we’re going to be able to find a single environmental trigger,” he says. “It’s going to be a combination.” Murray takes a similar view. “There are so many things going on, so many moving parts,” he says. “The challenge for us as scientists is to reduce it down to testable hypotheses.”

Damage control

While scientists grapple with how to explain the disease’s underlying causes, the rise in celiac prevalence is prioritizing the condition in the medical community and highlighting the need for improved diagnosis and management of existing cases. It’s worth remembering, says Murray, that celiac disease is a lifelong condition. “When you develop celiac disease, you can’t undevelop it,” he says. “You can heal it, by avoiding gluten, but you can’t put Humpty Dumpty together again. The immune system has changed.”

For now, the only treatment is a gluten-free diet. But as the focus on celiac disease has intensified, so too has research on the effects of this intervention—and the results are not encouraging. Several studies suggest that celiac patients’ guts are unlikely to heal completely even on such a diet, and accidental ingestion of the ubiquitous protein is almost inevitable; even tiny amounts can trigger symptoms. Moreover, patients find the gluten-free regimen difficult to tolerate because it is isolating and often impractical. In 2011, University of Sheffield celiac researcher David Sanders and his team surveyed 310 people diagnosed with celiac disease who were following a gluten-free diet, and found that more than 40 percent were dissatisfied.9 “Most people eat three times a day, and with celiac disease, that’s a challenge,” says Sanders. “Every single day, this problem is right in front of [them].”

To improve outcomes for patients with celiac disease, some scientists are exploring ways to tackle the celiac gut’s response to gluten and potentially restore tolerance. Bob Anderson, a celiac researcher and the chief scientific officer of ImmusanT, for example, is expanding work he began during a postdoc at the University of Oxford to develop a vaccine that could help desensitize patients to gluten over a series of injections. The vaccine, Nexvax2, comprises three peptides—components of gluten taken from wheat, barley, and rye—that trigger an adaptive immune response in celiac patients. Phase 1 results showed the vaccine to be safe, and Phase 2 trials are planned for later this year, Anderson says. “It’s a highly targeted approach to try to engage the bulk of the gluten-reactive T cells that play a pivotal role in causing and maintaining the disease,” he explains.

There are less orthodox approaches in the works, too. In recent years, researchers in Australia have been investigating the idea that infecting the body with intestinal parasites could help mitigate the effects of autoimmune disease by “giving the immune system something to do,” says Sanders, who was not involved in the work. In 2014, the team reported results from 12 people with celiac disease who, after being experimentally infected with hookworm larvae—which then migrated to the gut and grew—could follow a normal diet with significantly reduced symptoms.10 The same researchers plan to launch a follow-up, double-blind randomized trial with 60 participants later this year, and have stated that the results could inform the future development of new, non-parasite–based therapies.

Other experimental therapeutics for celiac disease would require patients to continue a gluten-free diet, but serve to reduce the severity of reactions to accidental exposure when taken before a meal. Baltimore-based pharmaceutical company Alba Therapeutics—cofounded by Fasano—has developed pills containing a synthetic peptide known as larazotide acetate, which inhibits a protein called zonulin that regulates epithelial-cell tight junctions. Zonulin is thought to increase the permeability of the gut, and is upregulated in people with celiac disease—factors that company researchers suggested could explain why the drug reduces gastrointestinal symptoms in celiac patients following gluten exposure. The therapy was licensed out for Phase 3 trials last year. Another experimental drug, developed by California-based Alvine Therapeutics and recently acquired by ImmunogenX, contains a mixture of two gluten-targeting proteases—one plant-derived and one derived from bacteria. The drug chops gluten molecules into successively smaller pieces, and reduced gut tissue damage in celiac patients who ingested small amounts of the protein in a Phase 2 trial.


With these alternative treatments a long way from being available for general use, many health-care providers argue that improved diagnosis should be the priority, to help better understand and care for the increasing numbers of people living with celiac disease. In 2014, a team at the University of Nottingham reported significant progress in this area, estimating the rate of celiac diagnosis to have increased dramatically in the U.K. over the last two decades, with nearly one in four sufferers now being diagnosed.11 “People in this field were clapping themselves on the back when that study came out,” says Sanders. “I think I clapped myself on the back too. But when I sat down and thought about it, I thought, that still means 75 percent of cases are undiagnosed. And that’s absurd, isn’t it?”

Part of the delay in improvements has to do with medical practitioners “playing catch-up” with an expanding range of symptoms recognized as warning signs, Sanders adds. While traditionally linked to gastrointestinal symptoms such as diarrhea in children, celiac disease is increasingly being identified in older patients, many of whom don’t display telltale gastrointestinal symptoms but have conditions previously deemed unrelated, such as anemia and osteoporosis. “The types of presentations have changed,” agrees Murray, who published accounts of these nonclassic symptoms in US patients in the early 2000s. “It’s turned on its head the preconceived notion of what the disease should be.”

Sanders also points to reticence among physicians about diagnosing the condition—a situation that, somewhat paradoxically, hasn’t been helped by the increasing popularity of gluten-free diets and publicity surrounding the “spectrum” of gluten-related disorders. (See “Grains of Truth.”) “There’s a lot of nihilism towards this condition, and I think it’s got a lot to do with the [treatment being a] gluten-free diet,” Sanders says. “If this was a tablet, nobody would argue, and physicians would say, ‘Oh, you have to take tablet X.’ But the word ‘diet’ has all sorts of connotations. . . . Even the medical fraternity can have quite derogatory views [of diet-based treatments].”

There’s a lot of nihilism towards this condition, and I think it’s got a lot to do with the treatment being a gluten-free diet. Even the medical fraternity can have quite derogatory views of diet-based treatments.—David Sanders, University of Sheffield

There’s related concern that public attention to gluten could undermine how celiac disease is detected, as people self-prescribe a gluten-free diet even without being diagnosed by a doctor. “Now, the latest trend is people avoiding gluten without having a diagnosis at all,” explains Murray. “I think that when it comes to counting and measuring and knowing what’s happening in the world with celiac disease, it’s going to be very hard to see that with this gluten-free train.” Last year, for example, researchers at Rutgers University highlighted an apparent stabilization in celiac diagnoses since 2009, but also noted a soaring number of people following gluten-free diets in the U.S. without having ruled out celiac disease, making it impossible to determine the true fluctuation in celiac prevalence.12

To get a stronger grip on the global prevalence of celiac disease and better help those who are diagnosed, researchers are keen to discourage people from self-treating for gluten-related disorders. For all ailments thought to be related to gluten consumption, “there needs to be a greater awareness of science as a discipline to examine these questions,” says Murray. “It doesn’t matter how high you pile anecdotes, that does not science make.”