What is genetic screening in populations?

About 3 billion years ago, the experiment known as “life” began on Earth. All living creatures from the five kingdoms are descended from a single common ancestor. We know that all creatures on this planet are related to one another because we all share the same digital “language of life,” written in DNA and RNA and built from five simple nucleotide bases strung along a sugar-phosphate backbone. If we could meet our primordial progenitor, we might not recognize it as living at all. We suspect it was nothing more than a self-replicating RNA catalyst that copied itself by consuming chemicals in its immediate environment. These simple “ribo-organisms” were inherently unstable, forming and falling apart quite easily. Only over hundreds of millions of years did these organisms gain stability, first when single-stranded RNA evolved into double-stranded DNA, and then further when that DNA began coding for proteins that would eventually build cellular structure.

Inherent in this narrative is a central fact that is easy to overlook: life began and continues to evolve within specific environments. We all know the classic riddle, “Which came first, the chicken or the egg?” From the perspective of evolutionary biology, this is not much of a riddle, because the chicken’s ancestors and their eggs preceded the arrival of the chicken by at least 100 million years. On closer reflection, however, the riddle asks a far deeper and more perplexing question: “What is the relationship between the individual and its environment, between the chicken and its egg?” A hospitable environment had to precede the development of life, and no new life can evolve unless there is an environment to support it. Yet environments change over time, and a species must change to accommodate its changing environment or invariably become extinct, as the vast majority of species that have ever lived on this planet have done. The central premise of Charles Darwin’s grand theory of the origin of species speaks of survival of the fittest, but over time the fittest species is the one that can adapt best to a changing environment. Adaptation is not exclusive to the development of new species; it also plays a critical role in the survival of any individual within a species.

The twenty-first century may well be remembered as the century in which science first truly began to understand the complex interaction between the genetic information inherent in every individual and the environment to which that individual is exposed.

“Environment” is broadly understood in genetics to include everything that is not the genetic information, or genome, itself. Environment includes climate, physical surroundings, exogenous chemicals or toxins, and infectious agents, but it also includes diet, lifestyle, and behavioural factors. Scientists once believed that genes were immutable archives of digital information that simply coded for proteins, which in turn determined the structure and function of an individual’s (analogue) body. However, it is now irrefutable that gene expression is substantially affected by and sensitive to environmental change. Genes quite literally respond to the environment to which they are exposed. Because genes respond to the environment we subject them to, the environment itself can be proactively manipulated to alter gene expression, and therefore, to change the health state of the individual. This is the central premise of preventive or functional genomics.

In the interactive symphony between genes and environment, the balance between the two may be altered whenever either changes. Because the environment is inherently unpredictable, one effective strategy for increasing the chances of survival has been to promote genetic variation within a species: slightly altered individuals may survive an environmental change that others do not. It is rather like playing a lottery: the more different combinations of numbers one is able to play, the better the odds of winning. Variety may truly be the spice of life. If populations from a single species diverge into different environments, the selective pressure over time may eventually lead to the creation of separate species.

The human genome is composed of approximately 3 billion nucleotides of genetic code. If you compared your DNA with that of the next person you happen to meet, you would find that about 1 in every 1000 nucleotides is different. That means there are roughly 3 million reasons why the two of you are not the same. These subtle variations of the genetic code are known as polymorphisms (literally, “many shapes,” because if the change in the genetic code results in an amino acid substitution, the resulting proteins will have different shapes and slightly altered functions as well). It is these polymorphisms that are largely responsible for our biochemical individuality; they help make us unique individuals. There are many types of polymorphisms, but by far the most common is the single-nucleotide polymorphism (abbreviated SNP and pronounced “snip”), in which a single nucleotide of the DNA is altered. The sum total of an individual’s polymorphisms significantly affects protein synthesis and physiologic function, rendering each individual biologically and biochemically unique.
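
As a quick back-of-the-envelope check of those figures, the arithmetic can be sketched in a few lines (illustrative only; the two inputs are simply the numbers quoted above):

```python
# Back-of-the-envelope arithmetic for the figures quoted above (illustrative only).
genome_size = 3_000_000_000      # ~3 billion nucleotides in the human genome
difference_rate = 1 / 1000       # roughly 1 difference per 1,000 nucleotides

expected_differences = int(genome_size * difference_rate)
print(f"{expected_differences:,} expected nucleotide differences")  # 3,000,000
```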

The theory of natural selection has two central tenets: (1) all organisms compete for limited resources, and (2) organisms with some advantage in acquiring those resources are more likely to survive, to thrive, and to pass on that advantage to their offspring. Polymorphic variations are the ultimate source of these advantages. All polymorphisms that are found to be common in a species must afford some adaptive advantage to survival in some specific environment. Genetic polymorphisms are preserved and become more prevalent in a species when they endow a better chance of survival or of reproduction. The cumulative weight of slight genetic variations over time is the means by which variability within a species arises and by which new species also emerge. However, what may be good for a species and its evolution may not be good for a specific individual, because the environmental pressures exerted on a species over millions of years may be very different from the environment encountered in the present.

In individuals, altered polymorphic genes (the legacy of evolution) produce altered proteins, and altered proteins exhibit altered functions. For any given individual, altered protein function may be beneficial, neutral, or harmful, depending on the environment to which he or she is exposed. Prevalent polymorphisms are likely to be beneficial in certain environments but harmful in others. Because we cannot change our genes, the goal of preventive genomics is to alter an individual’s environment based on his or her specific genetic variations to optimize his or her genetic potential. Genes themselves cannot be modified, but gene expression can be. One of the common misconceptions of polymorphisms is that they reveal only limitations, but in reality, they reveal an individual’s potential.

From a clinical perspective, changes in diet are one major way of altering the gene–environment balance and improving an individual’s health. Two distinct but interrelated fields of inquiry and knowledge are emerging. Nutrigenetics, also referred to as personalized nutrition, is based on the understanding that our genetic polymorphisms change the way we respond physiologically to specific nutrients. By studying specific polymorphisms and their physiologic response to specific nutrients, we can determine a more optimal nutritional regimen for a specific individual based on his or her specific polymorphisms. Nutrigenomics, by contrast, focuses on the effects of specific nutrients (both macro and micro) on the genome as a whole, and subsequently on the body’s resulting total pool of proteins (the proteome) and on all its subsequent metabolic activity (the metabolome) as well.

Food is, by definition, derived from other living organisms. Inherent in any food is an information code that we read and interpret by eating that food. There is a literal exchange of information that passes between the eaten and the eater. Not surprisingly, central to life is our ability to coordinate our metabolic activities with nutrient availability. Signals from the exterior world (food, toxins, weather, stress, etc.) turn on and turn off specific metabolic processes to improve our chances of survival in an ever-changing world.

The most extreme example of the interaction between food availability and metabolic change is the situation, common in nature, in which no food at all is available, i.e., famine. In the 1930s, Clive McCay at Cornell University found that by restricting the calorie intake of rats by at least 25% from the free-feeding level, he could substantially increase their average and maximum life expectancies, delay the age of tumour onset, delay the cessation of reproductive function, and preserve functional homeostasis or stress-response capacity. Since that time, no major investigation has failed to demonstrate significant benefits to health and longevity from calorie restriction. Calorie restriction lowers the level of insulin exposure, which in turn lowers overall growth factor exposure, helps maintain mitochondrial function that otherwise declines with age, and helps to maintain a long-term favourable balance of the insulin-to-growth hormone antagonism. From an evolutionary perspective, this makes intuitive sense. Only when calories are abundant does an individual have the metabolic resources for growth and reproduction. Survival during times of famine necessitates metabolic shifts that conserve resources and promote repair, because growth is not an option.

Reducing calorie intake not only lowers overall insulin exposure but is now known to activate the transcription of a family of genes known as sirtuins (SIRT). The SIRT enzymes appear to have first arisen in primordial eukaryotes, possibly to help them cope with adverse conditions such as famine, and today are found in all plants, yeast, and animals. In response to calorie restriction, SIRT-1 stimulates the production of new mitochondria in skeletal muscle and liver cells, thereby increasing the capacity for metabolic repair and energy production. New mitochondria produce fewer free radicals than old mitochondria, which leads to less free-radical damage and delays the onset of metabolic aging. SIRT-1 also has a cascading effect on multiple genes, leading to increased catalase activity, increased free fatty acid oxidation for energy, and reduced inflammation via suppression of the transcription factor nuclear factor-κB (NF-κB).

The obvious problem with calorie restriction, however, is that we like food. Recent studies of the polyphenol compound resveratrol (initially isolated from grape skins) in obese, sedentary mice suggested that it could mimic the beneficial health effects of calorie restriction even while the mice continued to eat a high-calorie, high-fat diet and did not exercise. Dietary supplementation with resveratrol was found to oppose the effects of the high-calorie diet in 144 of 153 altered biochemical pathways, most of which could be attributed to its activation of the transcription of SIRT-1. Resveratrol increased insulin sensitivity, reduced insulin-like growth factor-1 production, increased adenosine monophosphate-activated protein kinase activity, increased peroxisome proliferator-activated receptor-γ coactivator-1 (PGC-1) activity, increased the number of mitochondria, and improved overall motor function. Such research suggests that at some future time we may be able to have our cake, eat it too, and suffer few of the metabolic consequences of overeating. Nevertheless, the best current nutritional advice for longevity and health remains simply, “eat less.” Not only does it lower insulin load, but it also activates the transcription of numerous enzymes that promote mitochondrial regeneration, health, and longevity.

Not only is nutrigenomics helping to elucidate the myriad effects of specific nutrients on our genome, but it is also helping us rethink the mechanisms by which other nutrients act on our physiology. Ginkgo biloba is a commonly prescribed medicinal herb that is known to increase peripheral microcirculation and to contain rather potent antioxidants. The vasodilation allows delivery of these antioxidant compounds to poorly vascularized areas, such as the brain. Not surprisingly, ginkgo is commonly used for the impaired memory and mental function that can accompany aging. Using gene chip assays, researchers examined cellular extracts for altered levels of messenger RNA, an accurate measure of gene activity. In vitro studies in which human bladder cancer cells were incubated with ginkgo showed suppression of transcription by more than 50% in 16 genes and induction by more than 100% in 139 genes. The overall effect of adding ginkgo was to activate genes that code for improved mitochondrial function and antioxidant protection. Subsequent mouse studies examined the activity of more than 12,000 genes and showed preferential activation of genes within the brain, with induction by more than 200% of 43 genes in the cortex and 13 genes in the hippocampus, including genes that promote nerve cell growth, differentiation, regulation, and function, as well as increased mitochondrial activity and antioxidant protection.

This elegant research points to a new understanding of why and how herbal medicines or other specific nutrients can act to change our physiology, but also helps to explain why specific herbs or nutrients act preferentially on specific tissues or organ systems. It is not just Ginkgo biloba that acts to alter gene transcription. Every medicinal herb, every nutrient, every food, and every pharmaceutical medicine is likely to act in a similar fashion. These compounds do not just have a gross chemical effect on our physiology, but they actually alter gene expression. Nutrigenomics is forcing us to rethink the ways in which our bodies respond to environmental stimuli in the form of food, herbs, or medicine.

In some areas, nutrigenomics and nutrigenetics can overlap. We propose the use of the term “preventive genomics” to include both nutrigenomics and nutrigenetics because they cannot always be easily separated. To illustrate, PPAR-γ is a nuclear hormone receptor that regulates many cellular functions, such as nutrient metabolism, cell proliferation, and cell differentiation in response to dietary macronutrients, specifically to carbohydrate and fat intake. PPAR-γ integrates the cellular control of energy, lipid, and glucose homeostasis. Its activation by increased dietary fat or sugar intake is clearly an example of nutrigenomic interaction. However, there is a common polymorphism in the gene that codes for PPAR-γ in which at the twelfth amino acid in the protein, an alanine is substituted for a proline (the polymorphism is referred to as PPAR-γ P12A). Individuals with the 12A variant display a greater metabolic tolerance to a high-fat diet, leading to a significantly reduced risk of developing insulin resistance, type 2 diabetes, coronary artery disease, and central obesity when consuming a typical Western diet. Individuals with the 12P variant are more sensitive to the ill effects of excessive dietary saturated fat intake, suggesting a clear therapeutic dietary strategy in P allele carriers to prevent obesity, diabetes, and heart disease.

There are many polymorphisms like the PPAR-γ proline variation that exist in a large percentage of the population and appear to increase the risk of certain serious diseases. They raise the question: why do these seemingly harmful polymorphisms exist? It is important to remember that the PPAR-γ proline variation is harmful only in individuals who eat a high-calorie, high-saturated-fat, typical Western diet. It behoves us to remember that in most of nature and for most of human history, having too much food to eat was rarely a major risk factor. Quite the opposite was true when winter and famine were common occurrences. In those environments, the ability to extract more nutrition from the same caloric intake would be a distinct advantage for survival. It is only in the last 50 to 100 years, in our culture of affluence, that these variations have begun to pose significant risks to our health. We call these genes “thrifty genes,” a term coined by D.L. Coleman to explain why the Pima Indians of the American southwest desert were prone to develop obesity and diabetes. Thousands of years of survival in that harsh environment selected for genes that made this group incredibly efficient at retaining calories from food, a distinct adaptive advantage when the food supply was scarce. However, with the 24-hour grocery only a car ride away, their genes are now far less well adapted for survival. We see similar gene variations that increase inflammation throughout the body. This seems counterproductive until we realize that infectious disease has been a major environmental risk throughout evolution, and inflammation and immune activation are essentially the same biological process. It is imperative to remember that every polymorphism that exists in humans with significant prevalence confers protection and a survival advantage in some environment. Our task as clinicians is to identify that environment and to recommend it to those patients with that particular genetic variation.

The reality is that the prevention and cure of complex diseases and syndromes are not to be found exclusively in our genes or our environment, but in the interactive symphony between the two. Nature (our genes) provides a plastic template that is largely adaptable to a wide range of environments (“survival of the most adaptable”), and slight variations in those genes can cause altered responses to specific environments (nutrigenetics). In contrast, nurture (our environment) switches genes on and off, largely controlling gene expression (nutrigenomics).

To illustrate this idea of gene–environment interaction, consider the research of Caspi et al. They studied variations in the promoter sequence of the gene coding for monoamine oxidase-A (MAO-A) and found that a promoter polymorphism caused some people to have high-activity MAO-A genes and others to have low-activity genes. Those with high-activity MAO-A would deactivate catecholamine neurotransmitters, such as dopamine and noradrenaline, more rapidly. They then examined whether these genes played a role in antisocial and violent behaviour in men who had been abused as children. Remarkably, they found that men with the high-activity MAO-A gene were virtually immune to the effects of maltreatment as children, seldom if ever becoming violent offenders, whereas men with the low-activity MAO-A gene were much more antisocial and violent, but only if they themselves were abused as children. In other words, for violent behaviour to manifest in adulthood, both the low-activity gene (nature) and childhood maltreatment (nurture) needed to be present. If either was missing from the equation, the adult was very likely to be well socialized and nonviolent.
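
The interaction logic described above (both factors must be present) can be summarized in a small, purely qualitative sketch (my own illustration; the labels are descriptive only, and no effect sizes from the study are implied):

```python
# Qualitative 2x2 summary of the MAO-A gene-environment interaction described
# above (illustration only; not data from the study).
risk = {
    ("low-activity MAO-A",  "maltreated"):     "markedly elevated risk of antisocial behaviour",
    ("low-activity MAO-A",  "not maltreated"): "low risk",
    ("high-activity MAO-A", "maltreated"):     "low risk",
    ("high-activity MAO-A", "not maltreated"): "low risk",
}
print(risk[("low-activity MAO-A", "maltreated")])
```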

The Centers for Disease Control and Prevention (CDC) published a Gene-Environment Interaction Fact Sheet in August 2000 that outlined the basic principles of a broad understanding of the causal interaction of genes and environment in human disease. In it, the CDC makes four main points:

  • Virtually all human diseases result from the interaction of genetic susceptibility and modifiable environmental factors.
  • Variations in genetic makeup are associated with almost all disease.
  • Genetic variations do not cause disease but rather influence a person’s susceptibility to environmental factors.
  • Genetic information can be used to target interventions.

In this brief paper, the CDC essentially outlined the chief tenets of nutrigenetics and of preventive genomics.

Ironically, these ideas are hardly new. In 1909, Archibald Garrod published Inborn Errors of Metabolism, in which, after identifying the first human disease shown to behave as a true Mendelian recessive trait (alkaptonuria), he went further and constructed a sweeping hypothesis that altered heredity was “the seat of chemical individuality.” “Inborn errors of metabolism,” he wrote, “are due to a failure of a step in the metabolic sequence due to loss or malfunction of an enzyme.” By examining the subtle end products of metabolism, he continued, we should be able to identify the differences that altered heredity produces in each individual. This is a remarkable insight, given that the word gene had only just been coined in 1909 and it would be roughly another 50 years before the structure and true function of DNA were confirmed. Moreover, Garrod’s book was published 3 years before the first vitamin was discovered, so he could have had no notion of vitamins as cofactors in enzymatic reactions. He concluded his prophetic work by envisioning the complex interaction between our unique genetic constitution and environmental factors in the exquisitely simple statement: “These idiosyncrasies may be summed up in the proverbial saying that one man’s meat is another man’s poison.”

The point of using genetic and genomic information in a clinical setting is to personalize the therapeutic regimen and to develop an effective strategy for true disease prevention. It is a common mistake, however, to think that the new preventive genomic information we can access will somehow make all previous therapies obsolete. This mistake is evident in the mindset that attributes disease risk to polymorphisms without any reference to environment (read almost any newspaper headline about genetics). Simply put, genetic information is no more and no less valuable than environmental information; we need both to make an optimal difference. Furthermore, at this stage of preventive genomic research, there are at best only 100 or so polymorphisms about which we have sufficient clinical information to make personalized nutritional recommendations. That information is useful, to be sure, but it is insufficient for comprehensive nutritional and therapeutic recommendations.

Carl Sagan once said, “If you want to make an apple pie from scratch, you must first create the universe.” Fortunately for making an apple pie, the universe has already been created, and fortunately for making comprehensive nutritional recommendations, natural selection has been active since the beginning of life on this planet. We and our ancestors have been adapting to environments for the last 3 billion years, if you want to count all life, or for a mere 600 million years, if you want to count only eukaryotic cells with aerobic respiration. Either way, a great deal of experimental trial and error has brought us to the present. The proper starting point of nutrigenomics and nutrigenetics is therefore not genetics per se, but good epidemiology. Good epidemiology can tell us what the best nutrition and lifestyle are for the average person; specific genetic polymorphisms can then help us modify those recommendations to meet the specific genetic needs of an individual patient.

Discovering the optimal nutrition and lifestyle for a specific individual will depend on developing a functional matrix that is minimally composed of:

  1. Good epidemiologic dietary and lifestyle data
  2. Specific genomic polymorphisms that alter specific macronutrient and micronutrient requirements
  3. Functional laboratory assessment of individual function, physiology, and micronutrient status

Among Western diets, the Mediterranean diet is one of the best studied and, from an epidemiologic perspective, arguably the best for disease prevention and optimal health. It has been demonstrated to reduce cholesterol and triglycerides, to prevent atherosclerosis and high blood pressure, and to reduce the risk of senility, stroke, heart disease, insulin resistance, type 2 diabetes, and numerous cancers, including those of the breast, prostate, and colon. Among elderly Europeans, a Mediterranean diet combined with moderate daily activity and no smoking reduced all-cause mortality by 67%, and switching to this healthier diet and lifestyle at age 70 reduced mortality rates by 50%.

Clearly, these results are impressive, but the central tenet of preventive genomics is that these risks can be reduced further with the judicious application of nutrigenetic information gathered from the analysis of individual polymorphisms. Not all polymorphisms are clinically relevant. Most of our interindividual genetic variation occurs in sections of the DNA that do not actually code for proteins. About 97% of our DNA is either of viral origin, repeats, or simply junk DNA that codes for nothing. To make genomic information clinically useful, polymorphisms must meet four essential criteria: they must be relevant, prevalent, modifiable, and measurable.

  1. First, the only polymorphisms in the genome of clinical interest are those that exert an effect on some specific aspect of our biochemistry and physiology (relevant).
  2. Second, given our current knowledge of the human genome, only polymorphisms that exist in a substantial portion of a population can be demonstrated in epidemiologic and case-control studies to be clinically relevant; in essence, we compare polymorphisms that occur in substantial numbers in both the case and control groups (prevalent).
  3. Third, only polymorphisms whose genetic expression is modifiable via reasonable clinical intervention are clinically useful. Such an intervention may be any modification of environment, including diet, lifestyle, and targeted nutraceuticals or pharmaceuticals. Although genes themselves are not modifiable, the phenotype they generate is modifiable via environmental changes because environment turns genes on and off.
  4. Finally, because our genes do not themselves change, we must be able to measure changes in our functional physiology to determine that the environmental changes implemented have been effective in modifying the phenotypic or physiologic expression of our unique genes. For this purpose, functional testing must be available and used in conjunction with genomic testing for polymorphisms (measurable).

To illustrate this model of clinical utility, let us consider the example of the detoxification and antioxidant protection afforded by the enzyme glutathione-S-transferase (GST). High levels of toxins and oxidative stress are associated with numerous degenerative diseases and with the aging process itself, so an enzyme like GST that protects against such oxidative damage is highly relevant to our health. GST polymorphisms have been associated with numerous cancers in epidemiologic and case-control studies. Although there are several isoforms of GST in the body, the µ isoform, GSTM1, is the most common in the liver. Fifty percent of the population does not have any copy of the GSTM1 gene, so the polymorphism is clearly prevalent. Because other isoforms of GST and other antioxidant pathways exist, it is possible to improve antioxidant protection by maintaining a high reduction potential through exogenous means, including dietary antioxidants. Furthermore, men and women who lack the GSTM1 gene altogether can increase their serum glutathione levels by 16% and 38%, respectively, and their other GST activity by 6% and 8%, respectively, simply by eating four or more servings of brassica vegetables weekly. Interestingly, individuals possessing a GSTM1 gene showed significantly less benefit. Similar benefits of regular brassica vegetable consumption were seen in reducing colorectal cancer by 53% and lung cancer in smokers by 70%, but again only in individuals lacking the GSTM1 gene. Thus, the increased oxidative stress is attenuated by regular brassica vegetable consumption (modifiable). Finally, we can validate the clinical efficacy of our intervention strategy by measuring functional levels of oxidative stress with any number of simple laboratory tests, such as urine lipid peroxides, urine 8-hydroxy-deoxyguanosine, serum glutathione, and the plasma cysteine/cystine ratio. The biological effects of our interventions are measurable.

Currently, most of the clinically relevant nutrigenomic information available relates to the relationship between polymorphisms with increased chronic disease risk and the dietary therapies and supplements that have been demonstrated to treat the physiologic imbalance effectively. This is particularly true of SNPs. In as many as one third of genetic polymorphisms, the corresponding enzyme has a decreased binding affinity for a vitamin or mineral coenzyme, resulting in a lower rate of reaction and altered enzyme function. In a review article, Ames et al provided evidence of more than 50 human diseases involving defective enzymes that could be remedied or ameliorated by the administration of higher doses of vitamins or minerals, which at least partially restore enzyme activity. Thus, the clinical validity of high-dose nutrient therapy is established, but only in genetically susceptible individuals.

Cardiovascular disease accounts for approximately 40% of all deaths in most industrialized countries and is also responsible for significant morbidity and diminished quality of life. Using data from more than 84,000 women who were monitored for 14 years, epidemiology researchers at Harvard University estimated that about 83% of cardiovascular events could be prevented if everyone ate a reasonably healthy diet, exercised daily, did not smoke, maintained a normal weight, and drank 1 to 2 alcoholic beverages per day. Although these five diet and lifestyle changes seem fairly simple, only 3.1% of the women in the study actually adopted all of them. Not only is compliance an issue, but epidemiologic studies also raise deeper philosophical questions, because such studies assume that all participants are essentially similar to one another. Are the same therapies equally effective for all individuals? Do all individuals need the same quantities of these diet and lifestyle modifications? Can preventive genomic testing help us individualize our therapeutic protocols?

Let us begin by asking what proportions of carbohydrate, protein, and fat in the diet are the most effective in maintaining normal serum cholesterol levels. Is it a low-fat, low-cholesterol diet as touted by Nathan Pritikin and Dean Ornish? Or perhaps a 40-30-30 (40% carbohydrate, 30% protein, 30% fat), low-calorie “Zone” diet as proposed by Barry Sears? Or a low-carbohydrate, high-protein, high-fat diet such as that made popular by Robert Atkins? Logically they cannot all be the best diet for everyone, yet some people swear by each of them as the answer for an optimal diet.

There is mounting evidence that optimal macronutrient proportions in the diet may be a function of specific polymorphisms. Apolipoprotein-E (Apo-E) is a molecule that mediates the interaction of chylomicron remnants and intermediate-density lipoprotein particles with lipoprotein receptors, including the low-density lipoprotein (LDL) receptor and the chylomicron remnant or Apo-E receptor. There are three variations of the Apo-E gene, known as Apo-E2, -E3, and -E4, and these polymorphic variants are important genetic modifiers of serum lipid responses, which consequently may significantly affect an individual’s risk for development of coronary artery disease. (Technically, each of these polymorphisms is a haplotype: two individual SNPs at different points in the Apo-E gene that are linked.) The E3/E3 genotype is the most common in all populations and serves as the benchmark genotype for comparison with any other possible genotype. The E4 allele is associated with a moderately increased risk of atherosclerosis and coronary artery disease (odds ratio [OR] = 1.53 for men and 1.99 for women in one study). The E2 allele is associated with lower LDL cholesterol levels. LDL cholesterol levels declined with each Apo-E2 allele by 8.8 mg/dL in Hispanics, by 25.6 mg/dL in non-Hispanic white persons, and by 18.1 mg/dL in African-Americans.

Given that each person inherits two alleles, there are six possible genotypes: E2/E2, E2/E3, E3/E3, E2/E4, E3/E4, and E4/E4. In patients with elevated serum total and LDL cholesterol, the cholesterol-lowering response to a low-fat, low-cholesterol diet increased as the sum of the allele numbers increased (in other words, response improved in the order that the genotypes are listed in the first sentence in this paragraph). Conversely, individuals carrying an E4 allele who ate a high-fat diet were much more likely to have elevated serum cholesterol.
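
The six genotypes and the ordering the text refers to can be made explicit with a short sketch (purely illustrative; the “allele numbers” are simply the 2, 3, and 4 in the allele names):

```python
# Illustrative only: enumerate the six possible Apo-E genotypes from the three
# alleles and order them by the "sum of allele numbers" described in the text.
from itertools import combinations_with_replacement

allele_number = {"E2": 2, "E3": 3, "E4": 4}

genotypes = list(combinations_with_replacement(allele_number, 2))  # six genotypes
genotypes.sort(key=lambda pair: allele_number[pair[0]] + allele_number[pair[1]])

for a, b in genotypes:
    print(f"{a}/{b}  allele-number sum = {allele_number[a] + allele_number[b]}")
# Sums run 4, 5, 6, 6, 7, 8 (E2/E4 and E3/E3 tie at 6).
```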

In contrast, serum triglyceride levels were found to be significantly higher in men, and to a lesser extent in women who carried an E2 allele. Moreover, the triglyceride levels of individuals with an E2 allele showed a dose-dependent relationship to table sugar consumption, an association not shared with the other genetic variations. Dietary sucrose consumption also increased very low-density lipoprotein (VLDL) cholesterol and triglycerides in Apo-E2 carriers. In contrast, E3/E3 individuals were found to have the lowest triglyceride levels, and E4 carriers to have intermediate levels. In another multiethnic study, plasma triglyceride levels were inversely correlated with the number of Apo-E4 alleles (175, 159, and 143 mg/dL with 0, 1, and 2 alleles, respectively). Another study found E4 carriers to have elevations in triglycerides, but only those who consumed alcohol regularly. A study of Turkish men who did not consume alcohol found no elevated triglyceride values in E4 carriers. The most effective dietary therapy to lower triglycerides is restricting carbohydrate, especially sugar, consumption. Because Apo-E2 men are prone to significant triglyceride elevations and their levels are less sensitive to dietary fat intake, a lower carbohydrate, higher protein, higher fat diet may be therapeutically desirable for them.

Individuals with the E3/E3 genotype are moderately affected by dietary fat intake but tend to have lower triglyceride levels than the other genotypes. In one study, individuals with high cholesterol and triglyceride values were treated with a short-term (7-day) very low-calorie juice fast (208 calories/day), and their responses were stratified by Apo-E genotype. Only E3/E3 individuals experienced significant improvement in all parameters, experiencing reductions in LDL cholesterol by 10% and in triglyceride values by 18%. In these individuals, it is tempting to speculate that the low-calorie, moderate 40-30-30 carbohydrate-protein-fat ratio “Zone” diet might be more effective in treating both hypercholesterolemia and hypertriglyceridemia, although no clinical trials to validate this hypothesis have yet been published. Also, in this trial, E2 carriers experienced a 31% drop in LDL levels but a 15% increase in triglyceride levels, lending some additional support for the carbohydrate-sensitive triglyceride hypothesis. E4 carriers had an opposite response to E2 carriers: a dramatic reduction of triglyceride levels by 49% but a rise in LDL levels by 13%, while eating only 200 calories/day.

Other dietary interventions that affected cholesterol levels were fibre and alcohol. High intake of soluble fibre (5.7 g/day) reduced LDL cholesterol by 6.6% and 5.6%, respectively, in E3 and E4 allele carriers but had little effect on E2 carriers. Similarly, a study of Korean patients with coronary artery disease found that replacing white rice with whole grains produced the greatest benefit in E3/E3 individuals (12% reduction in triglyceride, 8% reduction in LDL cholesterol, and 8% increase in high-density lipoprotein [HDL] cholesterol levels), as well as moderate benefit in E4 carriers (including a 15% increase in HDL cholesterol values), but there was absolutely no effect on cholesterol levels in E2 carriers. Thus, a high-fibre diet may be clinically most useful for E3/E3 individuals, modestly useful for E4 carriers, but not useful for E2 carriers.

The modest consumption (1 to 2 drinks) of alcohol daily has been shown epidemiologically to be extremely protective against coronary artery disease in the general population. However, Apo-E genotyping reveals that the benefits are not the same for everyone. In a comparison of nondrinkers with drinkers of alcohol, women who were drinkers had lower cholesterol (total and LDL) than women who did not drink, regardless of Apo-E genotype. In men, however, Apo-E2 carriers who were drinkers had lower cholesterol but Apo-E4 carriers had higher cholesterol, and alcohol consumption had no significant effect for Apo-E3/E3 individuals.

It is worth observing that moderate exercise, although not dietary therapy, was found to be effective in improving serum lipids in E2 and E3 genotypes but not in E4 carriers. In another study, high-intensity physical exercise was most effective in E4 carriers. Thus, E2 and E3 carriers may benefit from modest daily exercise, but for an individual with an E4 allele to benefit from exercise, it must be high intensity.

Similarly, knowledge of Apo-E genotype may allow for a more discriminating and more effective use of targeted pharmaceuticals for patients with lipid abnormalities. Statin drugs exhibited a greater cholesterol-lowering effect in Apo-E2 carriers, followed by Apo-E3/E3, with less effectiveness for Apo-E4 carriers. Apo-E2 carriers also showed better response to gemfibrozil and cholestyramine than the other genotypes. The greatest cholesterol-lowering effects for Apo-E4 carriers came from probucol, a potent antioxidant. However, although statins had a weaker cholesterol-lowering effect in Apo-E4 carriers, statins were most protective in preventing a second heart attack in men with an E4 allele. Statin use reduced the risk of a second heart attack in E4 carriers by 64% compared with only 33% for other genotypes, despite the fact that statins reduced LDL cholesterol levels least in Apo-E4 genotypes.

More prospective clinical trials are needed to fully elucidate the optimal therapeutic dietary interventions to protect against coronary artery disease in individuals based on their Apo-E genotype status. However, distinct patterns are beginning to emerge in terms of dietary and lifestyle management for the treatment of coronary artery disease.

LIPID-LOWERING THERAPIES STRATIFIED ACCORDING TO APOLIPOPROTEIN-E (APO-E) GENOTYPE*

THERAPY | E2/E2 or E2/E3 | E3/E3 | E4/E3 or E4/E4
Diet | Lower carbohydrate | Low-calorie, moderate-fat | Low-fat, no cholesterol
Soluble fibre | Little effect | Beneficial | Modestly beneficial
Alcohol intake | Daily (women and men) | Daily (women); little effect (men) | Daily (women); none (men)
Exercise | Moderate | Moderate | High-intensity
Pharmaceutical/nutraceutical | Bile sequestrants; statins | Statins | Probucol; statins

* The E2/E4 genotype is extremely rare and therefore lacks statistical power in both association studies and prospective trials.
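
As a purely illustrative sketch of how such a stratification might be encoded for decision support (my own construction based on the table above; the genotype groupings and therapy strings are assumptions for illustration, not clinical guidance):

```python
# Hypothetical lookup of the Apo-E therapy table above (illustration only, not
# clinical guidance). Genotypes are normalized into the table's three columns.
APOE_THERAPY = {
    "E2": {  # E2/E2 or E2/E3
        "diet": "lower carbohydrate",
        "exercise": "moderate",
        "agents": ["bile sequestrants", "statins"],
    },
    "E3": {  # E3/E3
        "diet": "low-calorie, moderate fat; soluble fibre",
        "exercise": "moderate",
        "agents": ["statins"],
    },
    "E4": {  # E4/E3 or E4/E4
        "diet": "low-fat, no cholesterol",
        "exercise": "high-intensity",
        "agents": ["probucol", "statins"],
    },
}

def therapy_column(genotype: str) -> str:
    """Map a genotype such as 'E2/E3' onto one of the table's three columns."""
    a, b = genotype.split("/")
    if "E4" in (a, b):
        return "E4"   # E4/E3 and E4/E4 (the rare E2/E4 is grouped here for simplicity)
    if "E2" in (a, b):
        return "E2"   # E2/E2 and E2/E3
    return "E3"       # E3/E3

print(APOE_THERAPY[therapy_column("E3/E4")]["diet"])  # low-fat, no cholesterol
```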

Apo-E polymorphisms are the best studied in terms of their association with atherosclerosis and coronary artery disease, but other polymorphisms exert similar effects. Cholesteryl ester transfer protein (CETP) is responsible for the transfer of insoluble cholesteryl esters from HDL to other lipoproteins. The Taq1B polymorphism of the CETP gene results in increased CETP levels, with an impaired ability to remove cholesterol from the cells and the bloodstream. The Taq1B polymorphism is associated with lower HDL cholesterol levels and an increased risk of development of atherosclerosis and coronary artery disease. In one study, individuals who were homozygous for the Taq1B polymorphism had larger reductions in LDL and VLDL levels when eating a low-fat diet with a high ratio of polyunsaturated to saturated dietary fat. Furthermore, daily moderate alcohol consumption raised HDL levels substantially, but only in individuals with the CETP polymorphism.

E-selectin (SELE) is a glycoprotein that binds circulating neutrophils to the endothelial lining of blood vessels. E-selectin is expressed on the surface of endothelial cells after stimulation by inflammatory mediators (a response mediated by NF-κB). Several polymorphisms in the SELE gene increase the adhesion activity of E-selectin, dramatically increasing the risk of atherosclerosis and premature coronary artery disease.

The primary therapeutic aim for carriers of a SELE polymorphism is to decrease NF-κB stimulation, in turn reducing SELE expression. Maintaining a high antioxidant potential has been shown to decrease NF-κB activation; thus, higher levels of dietary antioxidants and possibly supplemental antioxidants may be beneficial. The proinflammatory cytokines tumour necrosis factor-α and interleukin-1 also increase NF-κB activation. Antioxidant, fish oil, and milk thistle (silymarin) supplementation have each been shown to suppress interleukin-1 and tumour necrosis factor-α production directly and would therefore lower NF-κB activation.

Hypertension is an independent risk factor for coronary artery disease and for stroke. The therapeutic response to sodium restriction in hypertensive individuals is highly variable. A SNP in the angiotensinogen gene (AGT) causes an amino acid substitution in which threonine (T) replaces methionine (M) at amino acid 235. The T allele is associated with increased production of angiotensinogen and a tendency towards higher blood pressure. There is a stepwise increase in serum AGT from the MM to the MT to the TT genotype, both among persons with hypertension and among those with normal blood pressure. Although there have been some discrepancies between studies, a meta-analysis of all studies published between 1992 and 1996 showed that the 235T allele had a consistent, mild association with hypertension (OR = 1.20), a positive family history of hypertension (OR = 1.42), and more severe hypertension (OR = 1.34).

The 235T allele of the AGT gene is associated with greater blood pressure decreases than the 235M allele after an intervention to reduce sodium intake. Persons with the TT and MT genotypes showed significant systolic blood pressure reductions when consuming mineral salt compared with control subjects (P < 0.02 and P < 0.001, respectively), but persons with the MM genotype did not (P < 0.10). The net adjusted blood pressure reduction (systolic/diastolic) was –8.6/–3.9 mm Hg for persons with the TT genotype, –9.0/–5.2 mm Hg for those with the MT genotype, and –5.3/–1.0 mm Hg for those with the MM genotype.

Aerobic exercise was effective in reducing blood pressure but only in the MM (–3.7 mm Hg) and MT (–3.4 mm Hg) genotypes, and not in the TT (–0.4 mm Hg) individuals, among 477 previously sedentary white Americans. Finally, the use of angiotensin-converting enzyme inhibitors produced more dramatic reductions in blood pressure in 235T allele carriers (TT and MT) than in the MM genotype, illustrating the notion that diverse modifications in “environment” (diet, exercise, targeted pharmaceuticals) may produce similar alterations in phenotype (in this case, blood pressure) among genetically susceptible individuals.

As mentioned previously, one common physiologic effect of a SNP is a decreased binding affinity between a vitamin or mineral coenzyme and the structurally altered enzyme. There is mounting evidence that in most such cases, the diminished reaction rate of the polymorphic enzyme can be increased with high-dose supplementation of the cofactor micronutrient. Perhaps the best studied example of this model is the enzyme 5,10-methylenetetrahydrofolate reductase (MTHFR), which is responsible for re-methylating homocysteine into methionine, allowing methylation reactions to occur efficiently in the body. When methylation reactions are impaired, plasma homocysteine levels rise. Elevated homocysteine values have been associated with an increased risk of atherosclerosis, early coronary artery disease, abnormal haemostasis and venous thrombosis, decreased bone density, neural tube defects such as spina bifida, and acute leukaemia in adults. Folic acid and vitamins B2 (as flavin adenine dinucleotide [FAD]), B6, and B12 are all cofactors in the methylation cycle, and a dietary deficiency of any of these vitamins can result in elevated homocysteine levels. However, elevated homocysteine levels can also result from two common polymorphisms in MTHFR: a 677C-T nucleotide substitution and a 1298A-C nucleotide substitution. Someone with the 677T/T genotype has a 50% reduction in MTHFR enzyme activity, whereas the 1298C/C genotype carries a 32% reduction; the risk of heart disease is increased in 677 heterozygotes as well as in 677T/T homozygotes. In one study, approximately 28% of patients with elevated homocysteine did not respond to supplementation with folic acid, vitamin B12, and vitamin B6 because they had the 677T/T genotype and severely impaired MTHFR activity. It is worth noting that the 677C-T polymorphism occurs in the FAD-binding site of MTHFR, suggesting that vitamin B2 (riboflavin) supplementation may be a critical component of therapy.

Three rational therapeutic strategies may help to resolve elevated homocysteine levels. First, the vitamin cofactors of MTHFR may be supplemented at high doses, relying on the simple logic of the law of mass action: higher concentrations of substrates and cofactors drive a chemical reaction forward. Second, the metabolic product of the MTHFR enzyme, 5-methyltetrahydrofolate, may be supplemented directly. Third, alternate re-methylation pathways may be stimulated by supplementing trimethylglycine (betaine) as an alternate methyl-group donor. In practice, because of the low cost and high degree of safety of each of these options, all three strategies may be employed simultaneously. Regardless, serial plasma homocysteine measurements allow the practitioner to confirm that the therapy employed has been successful.
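
The mass-action logic of the first strategy can be illustrated with a minimal kinetic sketch (my own simplification, assuming ordinary Michaelis-Menten behaviour and arbitrary numbers; nothing here is drawn from MTHFR data):

```python
# Illustrative only: a polymorphic enzyme with weaker cofactor binding behaves
# like one with a higher Km, and raising the cofactor concentration [S]
# recovers part of the lost reaction velocity (the law of mass action at work).

def velocity(s: float, km: float, vmax: float = 100.0) -> float:
    """Michaelis-Menten reaction velocity at substrate/cofactor concentration s."""
    return vmax * s / (km + s)

km_normal, km_variant = 1.0, 5.0        # arbitrary illustrative units
for s in (1.0, 10.0, 50.0):             # low, moderate, and high cofactor levels
    print(f"[S]={s:5.1f}  normal={velocity(s, km_normal):5.1f}  "
          f"variant={velocity(s, km_variant):5.1f}")
# At low [S] the variant enzyme runs far slower; at high [S] the two converge.
```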

Although the science of nutrigenomics is still in its infancy, several principles have become clear with enormous potential impact for clinical medicine. Nutrients act as dietary signals that alter gene expression (nutrigenomics). In individuals with specific polymorphisms, their specific genetic variation may render that individual more or less sensitive to specific nutrients or environmental stimuli and, therefore, may help determine optimal diet and lifestyle (nutrigenetics). Knowing the effects of these environmental stimuli on gene expression allows us to utilize this information proactively to alter gene expression and to mitigate risk for developing various forms of chronic disease, like cardiovascular disease. These principles of nutrigenomics and nutrigenetics hold true for macronutrient balance in the diet, as well as for micronutrients, such as vitamins, minerals, and electrolytes. Although our primary focus here has been on nutrient–gene interactions, the same principles hold true for lifestyle changes and targeted pharmaceuticals. Nutritional intervention is merely a more specific application of environmental modification to alter gene expression. What should be clear is that as our knowledge of gene–gene and gene–environment interactions increases, so too will our capacity for delivery of increasingly personalized and primary prevention.

We use the term susceptibility gene to refer to a polymorphism that may render us more susceptible to the development of a chronic disease when we are exposed to an adverse environment. Most susceptibility genes have a low positive predictive value (the probability that a disease will develop in a person with a positive test result) and a low attributable risk (the proportion of cases of a disease that can be attributed to a susceptibility gene). Some researchers have therefore questioned the clinical utility of susceptibility genetic testing, but such arguments, by applying a single-gene/single-disease model to susceptibility genes, miss their potential clinical relevance. Susceptibility genes, much like a family history, must be seen as important but incomplete contributors to what is invariably a multifactorial risk assessment. In terms of health outcomes, the polymorphisms of preventive genomics raise risk modestly and are additive in their effects (gene–gene interactions), and their actual phenotypic expression is strongly affected by diet, lifestyle, and environment (gene–environment interactions). Rather than negating their clinical utility, the low positive predictive value and low attributable risk of these polymorphisms reflect this complexity: the polymorphisms offer a molecular basis for understanding the pathophysiology of complex multifactorial diseases, and an understanding of the environmental factors that affect their expression begins to suggest effective therapeutic strategies.

Because environment is broadly understood to refer to anything outside the genome itself, therapeutic regimens may be constructed to include any portion of the environment that has been shown to affect gene expression and phenotype. This is truly a holistic approach because effective therapeutic strategies may involve diet, nutritional, and targeted pharmaceutical supplementation; lifestyle and behavioural modification; and avoidance or elimination of toxins, xenobiotics, and microbes. Intervention at any level of our “environment” may prove clinically beneficial.

Functional medicine is the clinical discipline designed to promote health, to anticipate and prevent disease, or to correct an existing disease by improving physiologic function. The underlying assumption is that health and disease lie on the same continuum, and the connecting thread of the continuum is physiologic function. Before the manifestation of any frank disease, a progressive loss of homeostasis and increasing dysfunction occur. Clinical intervention in this strategy may begin at the earliest signs of imbalance. The promise of preventive genomics is that the point of effective intervention may begin even earlier, before the beginnings of physiologic dysfunction.

Preventive and nutritional genomic testing in clinical practice has its critics, who argue predominantly that there are insufficient clinical trials to demonstrate the diagnostic and therapeutic efficacy of genomic profiles. However, rather than demonstrating that all preventive genomic testing is “premature and scientifically unsound,” as suggested recently by Dr. Muin Khoury of the CDC, such objections merely underscore the perspective shared here: namely, that precisely because chronic disease is multifactorial, with complex gene–gene and gene–environment interactions, the diagnostic and therapeutic utility of preventive genomic testing must meet the four criteria of being relevant, prevalent, modifiable, and measurable. The last criterion, measurable, is critical at this early stage of preventive genomic testing. We must be able to measure the beneficial phenotypic and physiologic changes in individuals that result from the therapeutic strategy employed. There will always remain the possibility that other genetic polymorphisms or other environmental influences that we have not yet identified also affect an individual’s disease risk.

A similar cautionary argument should be made in cases in which current genomic knowledge offers conflicting therapeutic advice. A man with an Apo-E4 allele presumably should not drink alcohol, but what if he also has a CETP Taq1B polymorphism for which alcohol has been shown to boost HDL cholesterol levels dramatically? At present, we do not know the answer to this clinical conundrum. Fortunately, because fractionated lipid levels are easily measurable, the effects of moderate, daily alcohol intake on such a specific individual’s lipid profile may be easily ascertained. Thus, medicine must ultimately be empirical, relying on trial and error until the desired result (e.g., lower serum cholesterol) is achieved.

Genetic information, because it represents an unchangeable state, has the potential, at least in theory, to be used in a discriminatory manner by insurers, employers, and society at large. However, it should be noted that the paradigm for such discrimination views genetic information as representing an individual’s immutable limitations, as typified by single-gene diseases (e.g., Tay-Sachs disease, Huntington’s disease, sickle cell anaemia).

A fundamental cornerstone of preventive genomics is the idea that the phenotypic outcome of any unique genotype is modifiable through environmental change. Preventive genomic testing, rather than revealing an individual’s genetic limitations, more accurately reflects an individual’s genetic potential, given the right environment. Genetic polymorphisms have the potential to guide an individual to adopt appropriate dietary, lifestyle, and environmental changes that can optimize health and longevity.

Now that we know there are irrefutable connections between genes and environment, and we are learning their myriad effects on health and disease, ignorance is no longer ethically neutral. Choosing not to use genomic information in treating patients, now that its health implications are being effectively documented, is ethically untenable. For the first time in the history of medicine, preventive genomic testing allows us to assess individual risk for development of chronic diseases, to develop comprehensive risk reduction strategies before imbalances in homeostasis occur, and to institute optimal therapy interventions for patients who are already sick. This new personalized medicine, made possible by the advent of preventive genomics and nutrigenomics, offers practitioners and patients new opportunities for the prevention and treatment of disease and for the promotion of optimal health.

Summary of genomics, nutrigenomics, and the path of personalized medicine:

Preventive genomic testing exploring both nutrigenomics and nutrigenetics is both new and exciting. It affords practitioners both novel and effective avenues to develop personalized therapeutic regimens and to promote optimal disease prevention strategies. Although what we know is dwarfed by what we do not know, this imbalance does not negate the current therapeutic power that the last 20 years of genomic research has revealed. We agree with the following assertions:

  • There are many examples of effective research on gene–environmental interactions.
  • There is sufficient evidence to make clinical recommendations on the basis of individualized genomic predisposition.
  • There is a need for research on traits that protect from, as well as predispose to, disease.
  • Environmental modification, especially dietary change, may be the easiest and most efficient way to influence the risk of many diseases common today.

It is little wonder that Paul Berg, the pioneering researcher in recombinant DNA and genetic engineering and the winner of the Nobel Prize for Chemistry in 1980, once quipped, “All disease is genetic even when it is also something else.” However, it is equally true to say that all disease is environmental even when it is also genetic. The symphony between nature and nurture is the very essence of life and health for all creatures who call this Earth their home.

