When President Obama delivered a speech at MIT in 2009, he used a common science metaphor: “We have always been about innovation,” he said. “We have always been about discovery. That’s in our DNA.” Deoxyribonucleic acid, the chemical in which our genes are encoded, has become the metaphor of choice for a whole constellation of ideas about essence and identity. A certain mystique surrounds it. As Evelyn Fox Keller argues in her book The Century of the Gene, the genome is, in the popular imagination at least, the secret of life, the holy grail. It is a master builder, the ultimate computer program, and a modern-day echo of the soul, all wrapped up in one. This fantasy does not sit easily, however, with geneticists, who have grown more aware over the last several decades that the relationship between genes and biological traits is far less certain than once believed.
The popular understanding of DNA as a blueprint for organisms, with a one-to-one correspondence between genes and traits (called phenotypes), is the legacy of the early history of genetics. The term “gene” was coined in 1909 to refer to abstract units of inheritance, predating the discovery of DNA by forty years. Biologists came to think of genes like beads on a string that lined up neatly into chromosomes, with each gene determining a single phenotype. But, while some genes do correspond to traits in a straightforward way, as in eye color or blood group, most phenotypes are far more complex, set in motion by many different genes as well as by the environment in which the organism lives.
It turns out that the genetic code is less like a blueprint and more like a movie script, subject to revision and reinterpretation by a director. This process is called epigenetic modification (“epi” meaning “above” or “in addition to”). Just as a script can be altered with crossed-out words, sentences or scenes, epigenetic editing allows entire sections of DNA to be activated or de-activated. Genes can be as finely tuned as actors responding to stage directions to shout, whisper, or cackle.
These directions are encoded through what are essentially little deposits of specific chemicals, placed either directly onto our DNA or onto proteins called histones, around which our DNA is wrapped like Christmas lights around a tube. The most common form of epigenetic editing is called methylation, in which a methyl group (one carbon and three hydrogen atoms) latches on to either the DNA or the histone. DNA methylation switches off gene expression, while histone methylation may increase or decrease gene expression, depending on the type of methylation and where on a histone it’s deposited.
These modifications don’t change the underlying genetic code, but they influence how genes are expressed. Some modifications can last a lifetime, keeping certain genes switched off forever, like whited-out lines running through entire scenes of our genetic script. If you don’t have a tooth growing in your eye or a liver in your knee, thank your epigenome: the differences controlling cell lineage are dictated predominantly by DNA methylation. Other modifications are more like Post-it notes, influencing incremental degrees of gene expression depending on environmental circumstances and allowing cells to react quickly to changes. Permanent changes often involve large amounts of methylation, which make the DNA cling so tightly to the histone that it can’t be read, rather like glued-together pages of a movie script.
The long-running debate about nature versus nurture takes on a new complexion in light of this emerging understanding of how epigenetics affects organisms’ form and function. The boundary between nature and nurture turns out to be rather porous. For example, when the dominant female in a group of clownfish dies, an epigenetic modification occurs in the dominant male that increases the release of particular hormones and switches his sex to female. Similarly, the difference between male and female crocodiles is not encoded in their DNA; rather, incubation temperatures of 32-33 degrees Celsius will produce males, while females predominate at 30 degrees Celsius and below, thanks to a mechanism like that in the clownfish. Epigenetics has been offered as an explanation for why genetically identical mice kept in identical conditions can have a range of body weights, or why one of a pair of identical twins can develop schizophrenia while the other remains perfectly healthy. Stress, diet, and exposure to environmental toxins are thought to be some of the common epigenetic triggers in humans, although the processes are still poorly understood.
All creatures, then, live in two histories at once: their inherited evolutionary history, built up by their ancestors over many thousands of years, and their personal organic history, lived through a single life cycle. Epigenetics is a bridge between these two domains, adapting evolutionary advantages to the demands of the present moment. It etches the uncertainty of our life experience directly onto our biology. In fact, the philosophers of genetics Paul Griffiths and Karola Stotz have gone so far as to suggest that we abandon the idea of genes as fixed units of information and define genes instead as “things you can do with your genome.”
Consider the example of alcohol. Inexperienced drinkers get tipsy fast, but if they continue down this dissolute route, their alcohol tolerance increases. To keep up with an increased drinking habit, the liver adjusts the epigenetic information on the genes that produce alcohol-metabolizing proteins, “turning them up” so that alcohol is broken down faster. If we cut back on our booze, the liver “turns down” those genes’ expression, since it would be a waste of the body’s resources to produce an excess of those proteins. Epigenetic modifications may also contribute to alcohol cravings: when alcohol is consumed over a period of time, it appears to prime a gene that is responsible for higher levels of neuropeptide Y, a neurotransmitter that helps manage anxiety. But if the drinker turns down her consumption, the gene that modulates neuropeptide Y switches off, contributing to the anxiety associated with alcohol withdrawal.
As scientists learn more about these epigenetic acrobatics, new medical treatments are emerging. Research suggests that chronic diseases with no obvious genetic roots may be the result of genes locked into abnormal patterns of expression by epigenetics. For example, Zolinza is a new drug licensed to treat a blood tumor called cutaneous T cell lymphoma. Zolinza inhibits the enzymes that remove acetyl groups from histones, allowing acetyl groups to build up and switch on certain tumor-suppressing genes. Vidaza, a drug used to treat the preleukemic condition known as myelodysplastic syndrome, works in a similar way by inhibiting DNA methylation, which also turns on tumor-suppressing genes.
But many of the mechanisms underlying epigenetics continue to resist our understanding.
Among these is the heritability of epigenetic modifications. A strange phenomenon was observed among children who were born after the Dutch famine of 1944, known as the “Hongerwinter” or Hunger Winter. The occupying Germans had blockaded the northern and western regions of Holland as a form of collective punishment, and food became painfully scarce. After the war, scientists discovered that children conceived during the Hongerwinter had a higher than normal risk of obesity, cardiovascular disease, and other health problems, compared to their siblings conceived before or after the famine. Surprisingly, so did their children—even though the second generation hadn’t experienced the original episode of malnutrition, even in the womb. Similar phenomena were observed among survivors of the Biafra famine in Nigeria between 1968 and 1970, and the Chinese famine of 1958-1961 during Mao’s Great Leap Forward. The most likely explanation for these cases is that epigenetic changes caused by the original episodes of famine were passed from parent to child. Recently, scientists have found signs of methylation in the genomes of Hongerwinter survivors, including on genes associated with cholesterol transport and aging.
When and why epigenetic modifications pass on to the next generation is unclear. Enzymes strip most of the epigenetic data from our DNA when it is funneled into producing sperm or eggs; those cells contain only the epigenetic traces necessary for their own functioning, not for the phenotypes of the organism that they will go on to produce. When the sperm and egg merge to form a single cell, another wave of enzyme activity strips off epigenetic data specific to the egg and sperm, and replaces it with information that controls the gene expression required for the fetus’s early development. Yet there are apparently tracts of DNA where epigenetic information from the parents stubbornly persists, and we don’t know what that information is, or where it is stored.
The notion that creatures’ life experiences could shape the next generation is actually a very old idea. The ancient Greek physician Hippocrates wrote of a quasi-mythical race of people known as the Macrocephali, who had succeeded in cultivating long heads by molding the soft skulls of their newborn babies. The French naturalist Jean-Baptiste Lamarck picked up on this concept in the early 19th century, proposing that organisms acquire characteristics during their lifetime that can be passed on to offspring, although his ideas were widely discredited by Darwin’s successors. When evidence supporting complex patterns of epigenetic modification emerged in the 20th century, many biologists remained resistant.
In part, this reluctance is a familiar story in every scientific field where new ideas challenge long-entrenched theories. But perhaps part of the initial aversion to epigenetics was motivated by something in our cultural consciousness. Epigenetics undermines age-old ideas of the organism, particularly the human being, as having a stable essence—whether it is a divine soul, a curled-up miniature being waiting to unfold into a fully formed adult, or a molecular program from which we can read off a biologically predestined future. The claim that “it’s in our DNA,” it seems, no longer offers the reassuring bedrock of certainty that we once thought it did.
This article was originally published in Nautilus Magazine on June 6, 2013.