Features:

Beyond DNA: Epigenetics
Deciphering the link between nature and nurture

[Photo credit: Kryn Taconis/Library and Archives Canada/PA-169941]

Excerpted from The Epigenetics Revolution: How Modern Biology Is Rewriting Our Understanding of Genetics, Disease, and Inheritance, by Nessa Carey (Columbia University Press, 2012). Copyright © 2012 Nessa Carey.

We talk about DNA as if it’s a template, like a mold for a car part in a factory. In the factory, molten metal or plastic gets poured into the mold thousands of times, and, unless something goes wrong in the process, out pop thousands of identical car parts.

But DNA isn’t really like that. It’s more like a script. Think of Romeo and Juliet, for example. In 1936 George Cukor directed Leslie Howard and Norma Shearer in a film version. Sixty years later Baz Luhrmann directed Leonardo DiCaprio and Claire Danes in another movie version of this play. Both productions used Shakespeare’s script, yet the two movies are entirely different. Identical starting points, different outcomes.

That’s what happens when cells read the genetic code that’s in DNA. The same script can result in different productions. The implications of this for human health are very wide-ranging, as we will see from the case studies that follow. In all of these cases it’s really important to remember that nothing happened to the DNA blueprint of the people involved. Their DNA didn’t change (mutate), and yet their life histories altered irrevocably in response to their environments.

Audrey Hepburn was one of the twentieth century’s greatest movie stars. Stylish, elegant, and with a delicately lovely, almost fragile bone structure, she became an icon in her role as Holly Golightly in Breakfast at Tiffany’s, even to those who have never seen the movie. It’s startling to think that this wonderful beauty was created by terrible hardship. Audrey Hepburn was a survivor of an event in World War II known as the Dutch Hunger Winter. This ended when she was sixteen years old, but the aftereffects of that period, including poor physical health, stayed with her for the rest of her life.

The Dutch Hunger Winter lasted from the start of November 1944 to the late spring of 1945. That was a bitterly cold period in Western Europe, creating further hardship on a continent that had been devastated by four years of brutal war. Nowhere was this worse than in the western Netherlands, which at this stage was still under German control. A German blockade resulted in a catastrophic drop in the availability of food to the Dutch population. At one point the population was trying to survive on only about 30 percent of the normal daily calorie intake. People ate grass and tulip bulbs, and burned every scrap of furniture they could get their hands on, in a desperate effort to stay alive. More than 20,000 people had died by the time food supplies were restored in May 1945.

The dreadful privations of that time also created a remarkable scientific study population. The Dutch survivors were a well-defined group of individuals, all of whom suffered just one period of malnutrition, all of them at exactly the same time. Because of the excellent health-care infrastructure and record-keeping in the Netherlands, epidemiologists have been able to follow the long-term effects of the famine. Their findings were completely unexpected.

One of the first aspects they studied was the effect of the famine on the birth weights of children who had been in the womb during that terrible period. If a mother was well fed around the time of conception and malnourished only for the last few months of the pregnancy, her baby was likely to be born small. If, on the other hand, the mother suffered malnutrition only for the first three months of the pregnancy (because the baby was conceived toward the end of the terrible episode), but then was well fed, she was likely to have a normal-size baby. The fetus “caught up” in body weight.

That all seems quite straightforward, as we are all used to the idea that fetuses do most of their growing in the last few months of pregnancy. But epidemiologists were able to study these groups of babies for decades, and what they found was really surprising. The babies who were born small stayed small all their lives, with lower obesity rates than the general population. For forty or more years, those people had access to as much food as they wanted, and yet their bodies never got over the early period of malnutrition. Why not? How did their early life experiences affect these individuals for decades? Why weren’t they able to go back to normal once their environment reverted to the way it should be?

More unexpectedly, the children whose mothers had been malnourished only early in pregnancy had higher obesity rates than normal. Recent reports have shown a greater incidence of other health problems as well, including effects on certain measures of mental health. Even though those individuals had seemed perfectly healthy at birth, something had happened to their development in the womb that affected them for decades after. And it wasn’t just the fact that something had happened that mattered, it was when it happened. Events that take place in the first three months of gestation, a stage when the fetus is really very small and developing very rapidly, can affect an individual for the rest of his or her life.

Even more extraordinarily, some of these effects seem to be present in the children of this group, that is, in the grandchildren of the women who were malnourished during the first three months of their pregnancy. So something that happened in one pregnant population affected their children’s children. That raised the really puzzling question of how those effects were passed on to subsequent generations.

Let’s consider a different human story. Schizophrenia is a dreadful mental illness, which, if untreated, can completely overwhelm and disable an affected person. Patients may present with a range of symptoms including delusions, hallucinations, and enormous difficulties focusing mentally. People with schizophrenia may become completely incapable of distinguishing between the “real world” and their own hallucinatory and delusional realm. Normal cognitive, emotional, and societal responses are lost. There is a terrible misconception, however, that people with schizophrenia are likely to be violent and dangerous. For the great majority of patients that isn’t the case at all, and the people most likely to suffer harm because of this illness are the patients themselves. Individuals with schizophrenia are fifty times as likely to attempt suicide as healthy individuals.

Schizophrenia is tragically common. It affects between 0.5 and 1 percent of the population in most countries and cultures, which means that there may be more than 50 million people alive today who are suffering from this condition. Scientists have known for some time that genetics plays a strong role in determining if a person will develop this illness. We know this because if one of a pair of identical twins has schizophrenia, there is a 50 percent chance that their twin will also have the condition. That is much higher than the 1 percent risk in the general population or even the 15 percent risk for fraternal twins. Identical twins have exactly the same genetic code as each other. They share the same womb, and usually they are brought up in very similar environments. When we consider this, it doesn’t seem surprising that if one of the twins develops schizophrenia, the chance that his or her twin will also develop the illness is very high. In fact, we have to start wondering why it isn’t higher. Why isn’t the figure 100 percent? How is it that two apparently identical individuals can become so very different? An individual has a devastating mental illness, but will his or her identical twin suffer from it too? Flip a coin—heads they win, tails they lose. Variations in the environment are unlikely to account for this, and even if they did, how would those environmental effects have such profoundly different impacts on two genetically identical people?

Here’s a third case study. A small child, less than three years old, is abused and neglected by his or her parents. Eventually, the state intervenes, and the child is taken away from the biological parents and placed with foster or adoptive parents. These new caregivers love and cherish the child, doing everything they can to create a secure home, full of affection. The child stays with these new parents throughout the rest of his or her childhood and adolescence, and into young adulthood.

Sometimes everything works out well for such children. They grow up into happy, stable individuals indistinguishable from all their peers who had normal, non-abusive childhoods. But often, tragically, it doesn’t work out this way. Children who have suffered from abuse or neglect in their early years grow up with a substantially higher risk of adult mental health problems than the general population. All too often such a child grows up into an adult at high risk of depression, self-harm, drug abuse, and suicide.

Once again, we have to ask ourselves why. Why is it so difficult to override the effects of early childhood exposure to neglect or abuse? Why should something that happened early in life have effects on mental health that may still be obvious decades later? In some cases, the adult may have absolutely no recollection of the traumatic events, and yet he or she may suffer the consequences mentally and emotionally for the rest of life.

These three case studies seem very different on the surface. The first is mainly about nutrition, especially of the unborn child. The second is about the differences that arise between genetically identical individuals. The third is about long-term psychological damage as a result of childhood abuse.

But these stories are linked at a very fundamental biological level. They are all examples of epigenetics. Epigenetics is the new discipline that is revolutionizing biology. Whenever two genetically identical individuals are nonidentical in some way we can measure, this is called epigenetics. When a change in environment has biological consequences that last long after the event itself has vanished into distant memory, we are seeing an epigenetic effect in action.

When scientists talk about epigenetics they are referring to all the cases in which the genetic code alone isn’t enough to describe what’s happening—there must be something else going on as well. That is one of the ways that epigenetics is described scientifically: where things that are genetically identical can actually appear quite different from one another. But there has to be a mechanism that brings out this mismatch between the genetic script and the final outcome. Epigenetic effects must be caused by some sort of physical change, some alterations in the vast array of molecules that make up the cells of every living organ­ism. That leads us to the other scientific way of viewing epigenetics—the molecular description. In this model, epigenetics can be defined as the set of chemical modifications surrounding and attaching to our genetic material that change the ways genes are switched on or off, but don’t alter the genes themselves.

Although it may seem confusing that the word “epigenetics” can have two different meanings, it’s just because we are describing the same event at two different levels. It’s a bit like looking at the pictures in old newspapers with a magnifying glass, and seeing that they are made up of dots. If we didn’t have a magnifying glass we might have thought that each picture was just made in one solid piece, and we’d probably never have been able to work out how so many new images could be created each day. On the other hand, if all we ever did was look through the magnifying glass, all we would see would be dots, and we’d never see the incredible image that they formed together and that we’d see if we could only step back and look at the big picture.

The revolution that has happened very recently in biology is that for the first time we are actually starting to understand how amazing epigenetic phenomena are caused. We’re no longer just seeing the large image, we can now also analyze the individual dots that created it. Crucially, this means that we are finally starting to unravel the missing link between nature and nurture: how our environment talks to us and alters us, sometimes forever.

The “epi” in epigenetics is derived from Greek and means at, on, to, upon, over, or beside. The DNA in our cells is not some pure, unadulterated molecule. Small chemical groups can be added at specific regions of DNA. Our DNA is also smothered in special proteins. These proteins can themselves be covered with additional small chemicals. None of these molecular amendments—which I will explore in the next issue of Natural History—changes the underlying genetic code. But adding these chemical groups to the DNA, or to the associated proteins, or removing them, changes the expression of nearby genes. These changes in gene expression alter the functions of cells, and the very nature of the cells themselves. Sometimes, if these patterns of chemical modifications are put on or taken off at a critical period in development, the pattern can be set for the rest of our lives, even if we live to be over 100 years of age.

There’s no debate that the DNA blueprint is the starting point—a very important starting point and absolutely necessary, without a doubt. But it isn’t a sufficient explanation for all the sometimes wonderful, sometimes awful complexity of life. If the DNA sequence were all that mattered, identical twins would always be absolutely identical in every way. Babies born to malnourished mothers would gain weight as easily as other babies who had a healthier start in life. And we would all look like big amorphous blobs, because all the cells in our bodies would be completely identical. That’s because epigenetics is the mechanism by which cells with the same genetic code express different parts of it during development, becoming liver, muscle, brain, or any of the hundreds of other cell types in the human body.

Huge areas of biology are influenced by epigenetic mechanisms, and the revolution in our thinking is spreading further and further into unexpected frontiers of life on our planet. Why can’t we make a baby from two sperm or two eggs, but must have one of each? What makes cloning possible? Why is cloning so difficult? Why do some plants need a period of cold before they can flower? Since queen bees and worker bees are genetically identical, why are they completely different in form and function? Why are virtually all tortoiseshell cats female? Why is it that humans contain trillions of cells in hundreds of complex organs, and microscopic worms contain about a thousand cells and only rudimentary organs, but we and the worm have the same number of genes?

Scientists in both the academic and commercial sectors are also waking up to the enormous impact that epigenetics has on human health. It’s implicated in diseases from schizophrenia to rheumatoid arthritis, from cancer to chronic pain. There are already two types of drugs that successfully treat certain cancers by interfering with epigenetic processes. Pharmaceutical companies are spending hundreds of millions of dollars in a race to develop the next generation of epigenetic drugs to treat some of the most serious illnesses afflicting the industrialized world. Epigenetic therapies are the new frontier of drug discovery.

In biology, Darwin and Mendel came to define the nineteenth century as the era of evolution and genetics. Watson and Crick defined the twentieth century as the era of DNA, and the functional understanding of how genetics and evolution interact. But in the twenty-first century, it is the new scientific discipline of epigenetics that is deconstructing so much of what we took as dogma and rebuilding it in an infinitely more varied, more complex, and even more beautiful fashion.