Cheerios are the best-selling breakfast cereal in America. The multi-grain version contains 18 milligrams of iron per serving, according to the label. Like almost any refined food made with wheat flour, it is fortified with iron. As it happens, there’s not a ton of oversight in the fortification process. One study measured the actual iron content of 29 breakfast cereals, and found that 21 contained at least 120 percent of the label value, and 8 of those contained 150 percent or more.1 One contained nearly 200 percent of the label value.

If your bowl of cereal actually contains 20 percent more iron than advertised, that’s about 22 mg. A safe assumption is that people tend to consume at least two serving sizes at a time.1 That gets us to 44 mg. The recommended daily allowance of iron is 8 mg for men and 18 mg for pre-menopausal women. The tolerable upper intake—which is the maximum daily intake thought to be safe by the National Institutes of Health—is 45 mg for adults.
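The arithmetic is simple enough to sketch out. A back-of-the-envelope check (the 20 percent overage and the two-serving bowl are the assumptions above, not measurements):

```python
LABEL_IRON_MG = 18      # multi-grain label value, per serving
OVERAGE = 1.20          # cereals measured roughly 20% over label in the study cited
SERVINGS = 2            # assumed typical pour
UPPER_LIMIT_MG = 45     # NIH tolerable upper intake for adults

intake_mg = LABEL_IRON_MG * OVERAGE * SERVINGS
print(round(intake_mg, 1), "mg, against a", UPPER_LIMIT_MG, "mg upper limit")
```

One bowl lands within a couple of milligrams of the upper limit before anything else is eaten that day.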


A single bowl of what is supposed to be a pretty healthy whole-grain breakfast could, in other words, bring an average person awfully close to the maximum daily iron intake regarded as safe.



And that’s just breakfast.

At the same time that our iron consumption has grown to the borders of safety, we are beginning to understand that elevated iron levels are associated with everything from cancer to heart disease. Christina Ellervik, a research scientist at Boston Children’s Hospital who studies the connection between iron and diabetes, puts it this way: “Where we are with iron now is like where we were with cholesterol 40 years ago.”





The story of energy metabolism—the basic engine of life at the cellular level—is one of electrons flowing much like water flows from mountains to the sea. Our cells can make use of this flow by regulating how these electrons travel, and by harvesting energy from them as they do so. The whole set-up is really not so unlike a hydroelectric dam.

The sea toward which these electrons flow is oxygen, and for most of life on earth, iron is the river. (Octopuses are strange outliers here—they use copper instead of iron, which makes their blood greenish-blue rather than red). Oxygen is hungry for electrons, making it an ideal destination. The proteins that facilitate the delivery contain tiny cores of iron, which manage the handling of the electrons as they are shuttled toward oxygen.

This is why iron and oxygen are both essential for life. There is a dark side to this cellular idyll, though.


Normal energy metabolism in cells produces low levels of toxic byproducts. One of these byproducts is a derivative of oxygen called superoxide. Luckily, cells contain several enzymes that clean up most of this leaked superoxide almost immediately. They do so by converting it into another intermediary called hydrogen peroxide, which you might have in your medicine cabinet for treating nicks and scrapes. The hydrogen peroxide is then detoxified into water and oxygen.



Things can go awry if either superoxide or hydrogen peroxide happen to meet some iron on the way to detoxification. What then happens is a set of chemical reactions (described by Haber-Weiss chemistry and Fenton chemistry) that produce a potent and reactive oxygen derivative known as the hydroxyl radical. This radical—also called a free radical—wreaks havoc on biological molecules everywhere. As the chemists Barry Halliwell and John Gutteridge—who wrote the book on iron biochemistry—put it, “the reactivity of the hydroxyl radicals is so great that, if they are formed in living systems, they will react immediately with whatever biological molecule is in their vicinity, producing secondary radicals of variable reactivity.”2
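For readers who want the chemistry spelled out, the textbook forms of these reactions look like this (these are the standard equations, not reproduced from the article’s sources):

```latex
% Superoxide dismutase and catalase normally detoxify the byproducts:
%   2 O2^{.-} + 2 H+ -> H2O2 + O2,  then  2 H2O2 -> 2 H2O + O2
% Iron subverts this cleanup. The Fenton reaction: ferrous iron splits
% hydrogen peroxide, yielding the hydroxyl radical:
\mathrm{Fe^{2+} + H_2O_2 \longrightarrow Fe^{3+} + OH^{-} + {}^{\bullet}OH}
% Superoxide then re-reduces the iron, making it catalytic:
\mathrm{Fe^{3+} + O_2^{\bullet-} \longrightarrow Fe^{2+} + O_2}
% Net iron-catalyzed Haber-Weiss reaction:
\mathrm{O_2^{\bullet-} + H_2O_2 \longrightarrow O_2 + OH^{-} + {}^{\bullet}OH}
```

Because the iron is regenerated, even a trace amount can keep churning out hydroxyl radicals for as long as peroxide and superoxide are supplied.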

Such is the Faustian bargain that has been struck by life on this planet. Oxygen and iron are essential for the production of energy, but may also conspire to destroy the delicate order of our cells. As the neuroscientist J.R. Connor has said, “life was designed to exist at the very interface between iron sufficiency and deficiency.”3

Remarkable as it may seem for a substance so elementary, our understanding of human iron metabolism is relatively young. Many of the most fundamental components of iron physiology were worked out by members of a generation recent enough to include my own grandfather. But the very basics, at least, were uncovered in the early moments of modern science, the 18th and early 19th centuries. In 1764, Vincenzo Menghini described in wonderfully expressive language a series of experiments whereby he “finally discovered that the red globules themselves are the seat of iron, which I had been searching for high and low with wearisome and daily fatigue in all other parts of the animal.”* Menghini was largely correct in declaring red blood cells to be the seat of iron in the human body, for these cells do indeed contain the vast majority of total body iron.

Within red cells the vessel for this iron is a protein called hemoglobin, itself discovered in 1840 in Leipzig. It would be another century before hemoglobin’s molecular structure was fully elucidated (by an astounding and wondrous technique known as x-ray crystallography); its secrets would come to be illustrative of the way in which the body manages the power of the iron atom. Hemoglobin is composed of four subunits, which are themselves in turn composed of a single protein chain bound to a chemical entity known as heme. The essence of heme is that of a fantastically dynamic cage, consisting of a planar honeycomb of cyclic carbon compounds surrounding a ring of four nitrogen atoms, forming amongst themselves a sanctum at the core of the structure. It is here that a single atom of iron stands, tethered in place and brought to heel by its four sentinel handlers like some ancient creature upon a stadium floor.
When hemoglobin is circulated to an environment rich with molecular oxygen, like the capillary beds of the lungs, the molecule facilitates a gentler sort of interaction between the iron and oxygen atoms than that which occurs in radical formation. As it exists in heme, the iron atom no longer has license to release electrons fully to oxygen. Rather, the two elements share their electrons and are held together by this incomplete transaction, like two relay runners with one hand each upon the baton. This dynamic is exploited in a brilliant way to allow our red cells to distribute oxygen throughout the body. Hemoglobin is very sensitive to its milieu, and when it arrives in the periphery of the body’s circulation it undergoes subtle conformational changes which disrupt this bond and so jettison its cargo of oxygen to the tissues, before returning to the lungs to complete its circuit once again.

The theme of a protective protein cage containing kernels of iron recurs among the various key elements of human iron metabolism. It is by these sophisticated molecular means that iron is safely ferried throughout the body and deployed, from the liver where much iron is stored, to the bone marrow where hemoglobin is made, to the outlying tissues where iron is utilized for energy production. Two other prominent members of this symphonic chemistry are called transferrin and ferritin, and they are especially pertinent to our discussion because they are among the primary metrics by which physicians and researchers measure iron status in the body.

If hemoglobin can be likened to a small cage, then ferritin is a fortress. It’s a huge molecule, composed of 24 protein subunits and capable of housing 4,500 iron atoms. Nearly every cell in the human body makes it, along with those of every other animal on Earth, most plants, most bacteria, and even algae. It’s the principal storage depot for iron, keeping it safely sequestered so as not to raise Fenton-chemistry hell.
Ferritin wasn’t discovered until 1937, but it is now understood to be one of the most fundamental adaptations that life has yet acquired to manage the balance it long ago struck between iron and oxygen. The protein is easily measured in the blood and provides a rough correlate of total body iron stores.

Transferrin was identified several years later, with a serendipity on par with Fleming’s stray Penicillium mold and his subsequent discovery of penicillin. In 1944, two scientists named Schade and Caroline set out to investigate how the embryo within a bird’s egg manages to avoid being overrun by the bacteria and fungi with which a bird’s nest teems. They suspected that there must be some substance contained in the stuff of the egg itself that provided a sort of protective barrier. The scientists managed to isolate a component of the white that was indeed able to inhibit the growth of bacteria. Further efforts demonstrated that it did so by decreasing the amount of iron available to the pathogens, without which they could not grow. Two years later, the scientists identified a similar protein in human serum. It became clear from subsequent work that the primary function of this protein was not as an anti-microbial, but rather as the chief means by which iron is transported throughout the body. They named it transferrin.

Transferrin is more compact than ferritin, consisting of a single polypeptide chain that involutes upon itself in a series of helices and sheets to form two very specific binding domains for iron. Transferrin binds iron extraordinarily tightly while it travels through the blood and cellular milieu, preventing any untoward reactions. When a cell is in need of a little iron, it sends transferrin receptors to its surface membrane. The transferrin-iron complex docks with the receptor, and is brought into the cell by a process known as endocytosis.
This means that a little package of surface membrane pinches off and creates a membranous compartment inside the cell for the transferrin-iron-receptor unit. Hydrogen ions are pumped into the compartment, lowering the pH and loosening transferrin’s hold on its cargo of iron. What happens next remains nearly a complete mystery despite more than a half-century of research. Once the iron is liberated from transferrin, it is thought to exist in a “free” state, meaning that it is not bound to any known protective carrier protein. No one quite understands how this free iron then traverses the interior of the cell, or whether it may be chaperoned by some unknown shepherd, as it makes its way to the mitochondria or other organelles in the cell. This pool of intra-cellular free iron is a prime suspect for where iron might have opportunity to wreak havoc were it to come across a molecule or two of oxygen.

*Sheftel, A.D., Mason, A.B., & Ponka, P. The long history of iron in the Universe and in health and disease. Biochimica et Biophysica Acta 1820, 161-187 (2012).





At the end of the 20th century, the metabolism of iron in the human body was still a bit of a mystery. Scientists knew of only two ways that the body could excrete iron—bleeding, and the routine sloughing of skin and gastrointestinal cells. But these processes amount to only a few milligrams per day. That meant that the body must have some way to tightly regulate iron absorption from the diet. In 2000 a major breakthrough was announced: a protein had been found that functions as the master regulator for iron. The system, like so many biological systems, is perfectly elegant. When iron levels are sufficient, the protein, called hepcidin, is secreted into the blood by the liver. It signals gastrointestinal cells to decrease their absorption of iron, and other cells around the body to sequester their iron into ferritin. When iron levels are low, blood levels of hepcidin fall, and intestinal cells begin absorbing iron again. Hepcidin has since become recognized as the principal governor of iron homeostasis in the human body.

But if hepcidin so masterfully regulates absorption of iron from the diet to match the body’s needs, is it possible for anyone to absorb too much iron?

In 1996, a team of scientists announced that they had discovered the gene responsible for hereditary hemochromatosis, a disorder causing the body to absorb too much iron. They called it HFE. Subsequent work revealed that the product of the HFE gene was instrumental in regulating hepcidin. People who inherit a mutation in this gene carry a serious handicap in the entire regulatory apparatus that hepcidin coordinates.

This, then, leaves open the possibility that some of us could in fact take in more iron than the body is able to handle. But how common are these mutations? Common enough to matter for even a minority of people reading these words?


Surprisingly, the answer is yes. The prevalence of hereditary hemochromatosis, in which two defective copies of the HFE gene are present and there are clinical signs of iron overload, is actually pretty high—as many as 1 in 200 in the United States. And perhaps 1 in 40 may have two defective HFE genes without overt hemochromatosis.4 That’s more than 8 million Americans who could have a significant short-circuit in their ability to regulate iron absorption and metabolism.



What if you have only one defective HFE gene, and one perfectly normal gene? This is called heterozygosity. We would expect to find more people in this situation than the homozygotes, or those with two bad copies of the gene. And in fact we do. Current estimates suggest that more than 30 percent of the U.S. population could be heterozygotes with one dysfunctional HFE gene.4 That’s pretty close to 100 million people.

Does this matter? Or is one good gene enough? There isn’t much research, but so far the evidence suggests that some heterozygotes do have impaired iron metabolism. Studies have shown that HFE heterozygotes seem to have modest elevations of ferritin as well as transferrin saturation—a measure of how much of transferrin, the protein which chaperones iron through the blood, is loaded with iron—both of which would indicate elevated iron levels.5,6 And a study published in 2001 concluded that HFE heterozygotes may have up to a fourfold increased risk of developing iron overload.4


Perhaps more concerning is that these heterozygotes have also been shown to be at increased risk for several chronic diseases, like heart disease and stroke. One study found that heterozygotes who smoked had a 3.5 times greater risk of cardiovascular disease than controls, while another found that heterozygosity alone significantly increased the risk of heart attack and stroke.7,8 A third found nearly a sixfold increase in the risk of cardiomyopathy, which can lead to heart failure.9



The connection between excessive iron and cardiovascular disease may extend beyond HFE heterozygotes. A recent meta-analysis identified 55 studies of this connection that were rigorous enough to meet its inclusion criteria. Of these 55 studies, 27 supported a positive relationship between iron and cardiovascular disease (more iron equals more disease), 20 found no significant relationship, and 8 found a negative relationship (more iron equals less disease).10

A few highlights: a Scandinavian study compared men who suffered a heart attack to men who didn’t, and found that elevated ferritin levels conferred a two- to threefold increase in heart attack risk. Another found that having a high ferritin level made a heart attack five times more likely than having a normal level. A larger study of 2,000 Finnish men found that an elevated ferritin level increased the risk of heart attack twofold, and that every 1 percent increase in ferritin level conferred a further 4 percent increase in that risk. The only other risk factor found to be stronger than ferritin in this study was smoking.

Ferritin isn’t a perfect marker of iron status, though, because it can also be affected by anything that causes inflammation. To address this problem a team of Canadian researchers directly compared blood iron levels to heart attack risk, and found that higher levels conferred a twofold increased risk in men and a fivefold increased risk in women.





If cardiovascular disease is one point in iron’s web of disease, diabetes may be another. The first hint of a relationship between iron and diabetes came in the late 1980s, when researchers discovered that patients receiving regular blood transfusions (which contain quite a bit of iron) were at significantly increased risk of diabetes. In hemochromatosis, there had been no way to know if the associated disturbance in glucose metabolism was due to the accumulation of iron itself, or to the underlying genetic defect. This new link between frequent transfusions and diabetes was indirect evidence that the iron itself may be the cause.

The next step was to mine existing data for associations between markers of iron status and diabetes. The first study to do so came out of Finland in 1997: Among 1,000 randomly selected Scandinavian men, ferritin emerged as a strong predictor of dysfunctional glucose metabolism, second only to body mass index as a risk factor.11 In 1999, researchers found that an elevated ferritin level increased the odds of having diabetes fivefold in men and nearly fourfold in women—similar in magnitude to the association between obesity and diabetes.12 Five years later, another study found that elevated ferritin roughly doubled the risk for metabolic syndrome, a condition that often leads to diabetes, hypertension, liver disease, and cardiovascular disease.13


Christina Ellervik’s first contribution to the field came in 2011, with a study investigating the association between increased transferrin saturation—a measure of how much iron is loaded onto the transferrin protein, which moves iron through the blood—and diabetes risk.14 Ellervik found that within a sample of nearly 35,000 Danes, transferrin saturation greater than 50 percent conferred a two- to threefold increased risk of diabetes. She also identified an increase in mortality rates with transferrin saturation greater than 50 percent.
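Transferrin saturation is a simple ratio of two routine blood tests: serum iron divided by total iron-binding capacity (TIBC). A minimal sketch, with illustrative numbers rather than values from Ellervik’s data:

```python
def transferrin_saturation(serum_iron_ug_dl, tibc_ug_dl):
    """Percent of transferrin's total iron-binding capacity (TIBC)
    currently occupied by iron. Both inputs in micrograms per deciliter."""
    return 100.0 * serum_iron_ug_dl / tibc_ug_dl

# Illustrative example: serum iron of 180 µg/dL against a TIBC of 300 µg/dL
# works out to 60% saturation, above the 50% threshold flagged in the Danish study.
print(transferrin_saturation(180, 300))
```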

In 2015, she led another study that found that, among a sample of 6,000 people, those whose ferritin levels were in the highest 20 percent had 4 times greater odds of diabetes than those with ferritin levels in the lowest 20 percent.15 Blood glucose levels, blood insulin levels, and insulin sensitivity all were raised with higher ferritin levels.


There’s a problem here, though. All of these studies show associations. They show that two things tend to happen together. But they don’t tell us anything about causality. To learn something about causality, you need an intervention. In the case of iron, you’d need to lower the iron and then watch what happens. Fortunately, there’s a very easy and very safe intervention that lowers iron levels and is performed millions of times every year: phlebotomy, or blood removal, the same procedure used in blood donation.



One of the first studies to use phlebotomy to examine the relationship between iron and diabetes was published in 1998.16 The authors found that among both healthy and diabetic subjects, phlebotomy improved insulin sensitivity and glucose metabolism. A 2005 study found that regular blood donors exhibited lower iron stores and significantly greater insulin sensitivity than non-donors.17 In 2012, researchers phlebotomized pre-diabetic volunteers until their ferritin levels dropped significantly, and found a marked subsequent improvement in their insulin sensitivity.18 In that same year, a different group of scientists studied the effect of phlebotomy on several elements of metabolic syndrome, including glucose metabolism. They found that a single phlebotomy session was associated with improvement in blood pressure, fasting glucose, hemoglobin A1C (a marker for average glucose levels), and blood cholesterol six weeks later.19

Many caveats apply to this evidence—the line between correlation and causation remains unclear, some of the studies used relatively small sample sizes, and phlebotomy may cause other changes in addition to lowering iron. But taken together, the data lends weight to the idea that iron plays a significant role in the tortuous pathophysiology of diabetes.

As more published data began to suggest a relationship between iron, cardiovascular disease, and diabetes, researchers started casting broader nets.

Next up was cancer.





It had been known since the late 1950s that injecting large doses of iron into lab animals could cause malignant tumors, but it wasn’t until the 1980s that scientists began looking for associations between iron and cancer in humans. In 1985, Ernest Graf and John Eaton proposed that differences in colon cancer rates among countries could be accounted for by the variation in the fiber content of local diets, which can in turn affect iron absorption.20

The following year, Richard Stevens found that elevated ferritin was associated with triple the risk of death from cancer among a group of 20,000 Chinese men.21 Two years later Stevens showed that American men who developed cancer had higher transferrin saturation and serum iron than men who didn’t.22 In 1990, a large study of Swedish blood donors found that they were 20 percent less likely to get cancer than non-donor controls.23 Four years later, a group of Finnish researchers found that elevated transferrin saturation among 40,000 Scandinavians conferred a threefold increased risk for colorectal cancer, and a 1.5-fold increased risk for lung cancer.24

A host of research articles have been published since Graf and Eaton’s first paper, and most have supported an association between iron and cancer—particularly colorectal cancer. In 2001, a review of 33 publications investigating the link between iron and colorectal cancer found that more than 75 percent of them supported the relationship.25 A 2004 study found an increased risk of death from cancer with rising serum iron and transferrin saturation. People with the highest levels were twice as likely to die from cancer as those with the lowest levels.26 And in 2008, another study confirmed that Swedish blood donors had about a 30 percent decrease in cancer risk.27


There are a few other lines of evidence that support the association between iron and cancer. People with an HFE mutation have an increased risk of developing colon and blood cancers.28 Conversely, people diagnosed with breast, blood, and colorectal cancers are more than twice as likely to be HFE heterozygotes than are healthy controls.29



There are also a handful of interventional trials investigating the relationship between iron and cancer. The first was published in 2007 by a group of Japanese scientists who had previously found that iron reduction via phlebotomy essentially normalized markers of liver injury in patients with hepatitis C. Hepatocellular carcinoma (HCC) is a feared consequence of hepatitis C and cirrhosis, and they hypothesized that phlebotomy might also reduce the risk of developing this cancer. The results were remarkable—at five years only 5.7 percent of patients in the phlebotomy group had developed HCC compared to 17.5 percent of controls. At 10 years the results were even more striking, with 8.6 percent of phlebotomized patients developing HCC compared to an astonishing 39 percent of controls.30

The second study to investigate the effects of phlebotomy on cancer risk was published the following year by Leo Zacharski, a colorful emeritus professor at Dartmouth. In a multi-center, randomized study originally designed to look at the effects of phlebotomy on vascular disease, patients allocated to the iron-reduction group were about 35 percent less likely to develop cancer after 4.5 years than controls. And among all patients who did develop cancer, those in the phlebotomy group were about 60 percent less likely to have died from it at the end of the follow-up period.31





The brain is a hungry organ. Though it makes up only 2 to 3 percent of body mass, it consumes 20 percent of the body’s oxygen. With a metabolism that hot, it’s inevitable that the brain will also produce more free radicals as it churns through all that oxygen. Surprisingly, the brain appears to have less antioxidant capacity than other tissues in the body, which could make it more susceptible to oxidative stress.32 The balance between normal cellular energy metabolism and damage from reactive oxygen species may be even more delicate in the brain than elsewhere in the body. This, in turn, points to a sensitivity to iron.

It’s been known since the 1920s that neurodegenerative diseases—illnesses like Alzheimer’s and Parkinson’s—are associated with increased iron deposition in the brain. In 1924, a towering Parisian neurologist named Jean Lhermitte was among the first to show that certain regions of the brain become congested with abnormal amounts of iron in advanced Parkinson’s disease.33 Thirty years later, in 1953, a physician named Louis Goodman demonstrated that the brains of patients with Alzheimer’s disease had markedly abnormal levels of iron deposited in the same regions as the famed plaques and tangles that define the illness.34 Goodman’s work was largely forgotten for several decades, until a 1992 paper resurrected and confirmed his findings and kindled new interest. Two years later the then-new technology of MRI was deployed to probe the association between iron and disease in living patients, confirming earlier autopsy findings that the brains of Alzheimer’s patients demonstrated significant aberrations in tissue iron.35


By the mid 1990s, there was compelling evidence that Alzheimer’s and Parkinson’s disease involved some dysregulation of iron metabolism in the brain, but no one knew whether the relationship was a cause or a consequence of the disease process. Hints began trickling in at around the same time the MRI findings were being published. A 1993 paper reported that iron promoted aggregation of amyloid-β, the major constituent of Alzheimer’s plaques.36 In 1997, researchers found that the aberrant iron associated with Alzheimer’s plaques was highly reactive and able to freely generate toxic oxygen radicals.37 By 2010, it had been shown that oxidative damage was one of the earliest detectable changes associated with Alzheimer’s, and that reactive iron was present in the earliest stages of the disease.38,39 And in 2015, a seven-year longitudinal study showed that cerebrospinal fluid ferritin levels were a strong predictor of cognitive decline and development of Alzheimer’s dementia.40



Perhaps most surprising was the discovery in 1999 that the precursor to amyloid-β was under the direct control of cellular iron levels—the more iron around, the more amyloid was produced.41 This raised the tantalizing possibility that amyloid plaques might actually represent an adaptive response rather than a cause, an idea that has been indirectly supported by the spectacular failure of essentially all efforts to directly target amyloid protein as treatment for the disease.

Together, these findings suggest that abnormal iron metabolism in the brain could be a causative factor in Alzheimer’s and other neurodegenerative diseases. If that’s true, then we might expect that people who are genetically predisposed to aberrant iron metabolism would be at higher risk of dementing diseases than others. And so they are.

In the early 2000s, it was discovered that patients with familial Alzheimer’s were more likely than healthy controls to carry one of the HFE mutations.42 Another study found that these genotypes were associated with earlier onset of the disease compared to controls, and that the effect was even more powerful in people who carried an HFE mutation as well as an ApoE4 allele, the primary genetic risk factor for Alzheimer’s disease.43 A 2004 study showed that the co-occurrence of an HFE mutation with a known variant in the transferrin gene conferred a fivefold increased risk of Alzheimer’s.44 Two years later a team of Portuguese scientists found that the HFE variants were associated with increased risk of Parkinson’s as well.45

What about interventional trials? For neurodegenerative disease, there has been exactly one. In 1991, a team of Canadian scientists published the results of a two-year randomized trial of the iron chelator desferrioxamine in 48 patients with Alzheimer’s disease.46 Chelators are a class of medication that bind metal cations like iron, sequester them, and facilitate their excretion from the body. Patients were randomly allocated to receive desferrioxamine, placebo, or no treatment. The results were impressive—at two years, iron reduction had cut the rate of cognitive decline in half.

The study was published in The Lancet, one of the world’s most prestigious medical journals, but seems to have been forgotten in the 20-odd year interim. Not a single interventional study testing the role of iron in Alzheimer’s disease has been published since.





If so many studies seem to show a consistent association between iron levels and chronic disease, why isn’t more work being done to clarify the risk?

“It’s incredible that there is so much promising literature, and nobody—nobody—is doing the clinical trials,” Dartmouth’s Zacharski said to me. “If people would just take up the gauntlet and do well-designed, insightful studies of the iron hypothesis, we would have a much firmer understanding of this. Just imagine if it turns out to be verified!”

His perspective on why more trials haven’t been done is fascinating, and paralleled much of what other experts in the field said. “Sexiness,” believe it or not, came up in multiple conversations—molecular biology and targeted pharmaceuticals are hot (and lucrative), and iron is decidedly not. “Maybe it’s not sexy enough, too passé, too old school,” said one researcher I spoke to. Zacharski echoed this in our conversation, and pointed out that many modern trials are funded by the pharmaceutical industry, which is keen to develop the next billion-dollar drug. Government agencies like the NIH can step in to fill gaps left by the for-profit research industry, but publicly funded scientists are subject to the same sexiness bias as everyone else. As one senior university scientist told me, “NIH goes for fashion.”

Zacharski is convinced that iron overload is a huge common fulcrum underlying much of the chronic metabolic disease that is sweeping Western countries. He thinks that even subtly elevated iron levels can result in free radical formation, which in turn contributes to chronic inflammation. And chronic inflammation, we know, is strongly linked to everything from heart disease to diabetes, cancer to Alzheimer’s.

“If this doesn’t deserve randomized trials,” he told me, “then I don’t know what does.”

Until those randomized trials arrive—I’ll see you at the blood bank.





Clayton Dalton is an emergency medicine resident at Massachusetts General Hospital in Boston. He has published stories and essays with NPR, Aeon, and The Los Angeles Review.

Lead image: Liliya Kandrashevich / Shutterstock





References

1. Whittaker, P., Tufaro, P.R., & Rader, J.I. Iron and folate in fortified cereals. Journal of the American College of Nutrition 20, 247-254 (2001).

2. Halliwell, B. & Gutteridge, J.M. Oxygen toxicity, oxygen radicals, transition metals and disease. Biochemical Journal 219, 1-14 (1984).

3. Connor, J.R. & Ghio, A.J. The impact of host iron homeostasis on disease. Preface. Biochimica et Biophysica Acta 1790, 581-582 (2009).

4. Hanson, E.H., Imperatore, G., & Burke, W. HFE gene and hereditary hemochromatosis: a HuGE review. Human Genome Epidemiology. American Journal of Epidemiology 154, 193-206 (2001).

5. Beutler, E., Felitti, V.J., Koziol, J.A., Ho, N.J., & Gelbart, T. Penetrance of 845G→A (C282Y) HFE hereditary haemochromatosis mutation in the USA. The Lancet 359, 211-218 (2002).

6. Rossi, E., et al. Effect of hemochromatosis genotype and lifestyle factors on iron and red cell indices in a community population. Clinical Chemistry 47, 202-208 (2001).

7. Roest, M., et al. Heterozygosity for a hereditary hemochromatosis gene is associated with cardiovascular death in women. Circulation 100, 1268-1273 (1999).

8. Tuomainen, T.P., et al. Increased risk of acute myocardial infarction in carriers of the hemochromatosis gene Cys282Tyr mutation: A prospective cohort study in men in eastern Finland. Circulation 100, 1274-1279 (1999).

9. Pereira, A.C., et al. Hemochromatosis gene variants in patients with cardiomyopathy. American Journal of Cardiology 88, 388-391 (2001).

10. Muñoz-Bravo, C., Gutiérrez-Bedmar, M., Gómez-Aracena, J., García-Rodríguez, A., & Navajas, J.F. Iron: protector or risk factor for cardiovascular disease? Still controversial. Nutrients 5, 2384-2404 (2013).

11. Tuomainen, T.P., et al. Body iron stores are associated with serum insulin and blood glucose concentrations. Population study in 1,013 eastern Finnish men. Diabetes Care 20, 426-428 (1997).

12. Ford, E.S. & Cogswell, M.E. Diabetes and serum ferritin concentration among U.S. adults. Diabetes Care 22, 1978-1983 (1999).

13. Jehn, M., Clark, J.M., & Guallar, E. Serum ferritin and risk of the metabolic syndrome in U.S. adults. Diabetes Care 27, 2422-2428 (2004).

14. Ellervik, C., et al. Elevated transferrin saturation and risk of diabetes: three population-based studies. Diabetes Care 34, 2256-2258 (2011).

15. Bonfils, L., et al. Fasting serum levels of ferritin are associated with impaired pancreatic beta cell function and decreased insulin sensitivity: a population-based study. Diabetologia 58, 523-533 (2015).

16. Facchini, F.S. Effect of phlebotomy on plasma glucose and insulin concentrations. Diabetes Care 21, 2190 (1998).

17. Fernández-Real, J.M., López-Bermejo, A., & Ricart, W. Iron stores, blood donation, and insulin sensitivity and secretion. Clinical Chemistry 51, 1201-1205 (2005).

18. Gabrielsen, J.S., et al. Adipocyte iron regulates adiponectin and insulin sensitivity. Journal of Clinical Investigation 122, 3529-3540 (2012).

19. Houschyar, K.S., et al. Effects of phlebotomy-induced reduction of body iron stores on metabolic syndrome: results from a randomized clinical trial. BMC Medicine 10:54 (2012).

20. Graf, E. & Eaton, J.W. Dietary suppression of colonic cancer. Fiber or phytate? Cancer 56, 717-718 (1985).

21. Stevens, R.G., Beasley, R.P., & Blumberg, B.S. Iron-binding proteins and risk of cancer in Taiwan. Journal of the National Cancer Institute 76, 605-610 (1986).

22. Stevens, R.G., Jones, D.Y., Micozzi, M.S., & Taylor, P.R. Body iron stores and the risk of cancer. New England Journal of Medicine 319, 1047-1052 (1988).

23. Merk, K., et al. The incidence of cancer among blood donors. International Journal of Epidemiology 19, 505-509 (1990).

24. Knekt, P., et al. Body iron stores and risk of cancer. International Journal of Cancer 56, 379-382 (1994).

25. Nelson, R.L. Iron and colorectal cancer risk: human studies. Nutrition Reviews 59, 140-148 (2001).

26. Wu, T., Sempos, C.T., Freudenheim, J.L., Muti, P., & Smit, E. Serum iron, copper and zinc concentrations and risk of cancer mortality in US adults. Annals of Epidemiology 14, 195-201 (2004).

27. Edgren, G., et al. Donation frequency, iron loss, and risk of cancer among blood donors. Journal of the National Cancer Institute 100, 572-579 (2008).

28. Nelson, R.L., Davis, F.G., Persky, V., & Becker, E. Risk of neoplastic and other diseases among people with heterozygosity for hereditary hemochromatosis. Cancer 76, 875-879 (1995).

29. Weinberg, E.D. & Miklossy, J. Iron withholding: a defense against disease. Journal of Alzheimer’s Disease 13, 451-463 (2008).

30. Kato, J., et al. Long-term phlebotomy with low-iron diet therapy lowers risk of development of hepatocellular carcinoma from chronic hepatitis C. Journal of Gastroenterology 42, 830-836 (2007).

31. Zacharski, L.R., et al. Decreased cancer risk after iron reduction in patients with peripheral arterial disease: results from a randomized trial. Journal of the National Cancer Institute 100, 996-1002 (2008).

32. Lee, H.G., et al. Amyloid-beta in Alzheimer disease: the null versus the alternate hypotheses. Journal of Pharmacology and Experimental Therapeutics 321, 823-829 (2007).

33. Lhermitte, J., Kraus, W.M., & McAlpine, D. On the occurrence of abnormal deposits of iron in the brain in Parkinsonism with special reference to its localisation. Journal of Neurology and Psychopathology 5, 195-208 (1924).

34. Goodman, L. Alzheimer’s disease; a clinico-pathologic analysis of twenty-three cases with a theory on pathogenesis. The Journal of Nervous and Mental Disease 118, 97-130 (1953).

35. Bartzokis, G., et al. In vivo evaluation of brain iron in Alzheimer’s disease and normal subjects using MRI. Biological Psychiatry 35, 480-487 (1994).

36. Mantyh, P.W., et al. Aluminum, iron, and zinc ions promote aggregation of physiological concentrations of beta-amyloid peptide. Journal of Neurochemistry 61, 1171-1174 (1993).

37. Smith, M.A., Harris, P.L., Sayre, L.M., & Perry, G. Iron accumulation in Alzheimer disease is a source of redox-generated free radicals. Proceedings of the National Academy of Sciences 94, 9866-9868 (1997).

38. Nunomura, A., et al. Oxidative damage is the earliest event in Alzheimer disease. Journal of Neuropathology and Experimental Neurology 60, 759-767 (2001).

39. Smith, M.A., et al. Increased iron and free radical generation in preclinical Alzheimer disease and mild cognitive impairment. Journal of Alzheimer’s Disease 19, 363-372 (2010).

40. Ayton, S., Faux, N.G., & Bush, A.I. Ferritin levels in the cerebrospinal fluid predict Alzheimer’s disease outcomes and are regulated by APOE. Nature Communications 6:6760 (2015).

41. Rogers, J.T., et al. Translation of the Alzheimer amyloid precursor protein mRNA is up-regulated by interleukin-1 through 5’-untranslated region sequences. Journal of Biological Chemistry 274, 6421-6431 (1999).

42. Moalem, S., et al. Are hereditary hemochromatosis mutations involved in Alzheimer disease? American Journal of Medical Genetics 93, 58-66 (2000).

43. Combarros, O., et al. Interaction of the H63D mutation in the hemochromatosis gene with the apolipoprotein E epsilon 4 allele modulates age at onset of Alzheimer’s disease. Dementia and Geriatric Cognitive Disorders 15, 151-154 (2003).

44. Robson, K.J., et al. Synergy between the C2 allele of transferrin and the C282Y allele of the haemochromatosis gene (HFE) as risk factors for developing Alzheimer’s disease. Journal of Medical Genetics 41, 261-265 (2004).

45. Pulliam, J.F., et al. Association of HFE mutations with neurodegeneration and oxidative stress in Alzheimer’s disease and correlation with APOE. American Journal of Medical Genetics Part B 119B, 48-53 (2003).

46. Crapper-McLachlan, D.R., et al. Intramuscular desferrioxamine in patients with Alzheimer’s disease. The Lancet 337, 1304-1308 (1991).