Abstract
Nutrient–gene research tends to focus on human disease, although such interactions are often a by-product of our evolutionary heritage. This review explores health in this context, reframing genetic variation and epigenetic phenomena linked to diet within the framework of our recent evolutionary past. This “Darwinian/evolutionary medicine” approach examines how diet helped us evolve among primates and to adapt (or fail to adapt) our metabolome to specific environmental conditions, leading to major diseases of civilization. The review presents updated evidence from a diet–gene perspective, portraying the discord that exists between health and our overall nutritional, cultural, and activity patterns. While Darwinian theory goes beyond nutritional considerations, a significant component of the concept does relate to nutrition and the mismatch between genes, modern diet, obesogenic lifestyle, and health outcomes. The review argues that nutritional sciences should expand knowledge of the evolutionary connection between food and disease, assimilating it into clinical training with greater prominence.
Introduction
The term anthropocene was first introduced in 2000. 1 The neologism refers to a period of rapid innovation beginning in the late 18th century, when industrial enterprise set in motion changes of global impact on Earth’s biology, climate, human culture, and health. However, many anthropologists consider the anthropocene to have originated far earlier, when animal and plant domestication replaced hunter-gatherer lifestyles. 2 Most scientists now concur that this biocultural phenomenon comprises several stages extending back to the Neolithic.2,3 Its origins may be even more archaic; the discovery of fire 0.5 to 1.5 million years ago may be the true start of the anthropocene, a point in time that changed everything that followed. A putative marker of this is the smaller, weaker jaw and more compact dentition of subsequent hominins, indicative of reduced chewing, gnawing, and tearing commensurate with the expansion of cooking. 4 Cooking likely increased meat consumption and promoted encephalization, along with evolution of a smaller gut, enhanced bipedal agility, and anatomical remodeling of the hand for tool use. If the anthropocene is defined by the intellectual capability to modify the world rather than be modified by it, then the pivotal factor, controlling fire, perhaps represents its true inception point.
This article examines how biocultural adaptations to new foods during the anthropocene influenced human evolution, and why some of these past adaptations to ancestral environments have led to a mismatch between ancient genes and contemporary diet and lifestyle, contributing to the diseases of civilization/affluence. This approach underscores Darwinian/evolutionary medicine, which applies modern evolutionary theory to understanding health and disease. To achieve this, the review follows how recent human dietary transitions have contributed to present-day disease burden, identifying a selection of the increasing number of genes that influence common disease phenotypes. It then examines nutritional health taking account of thrift, the developmental origins of adult disease, ageing and lifecycle complexity, and the extent to which we are bridging the gap between modern and more ancient diets.
It is important to recognize that Darwinian theory per se extends well beyond nutrition to include the evolution of resistance to antibiotics and cancer chemotherapeutics, through to the influence of hygiene in the atopies and autoimmune disease. Other major components of the discipline relate to virulence and immune evasion. Although the nature of our evolved defense mechanisms is critically important, this article limits itself to the role of nutrition in the etiology of the diseases of affluence, taking account of our genetic legacy within the recent anthropocene.
Nutritional and Lifestyle Transitions: The Origin of a Panoply of Phenotypes
As hunter-gatherers were replaced by sedentary communities that began domesticating plant and animal species (Einkorn and Emmer wheat, progenitor legumes, and rye in the Karacadag Mountains of SE Turkey for instance), the scene was set for the negative correlate of sedentary agropastoral living to develop—disease. The belief that transitioning from foraging to agrarian lifestyles—a process beginning about 11 000 years ago that promoted development of large settlements—could actually improve human health is now considered a myth. The opposite is in fact true: Animal domestication altered meat composition from wild game with increased saturated fats and cholesterol. 5
Plant domestication led to monotypic diets of reduced nutrient diversity. Densely settled communities in concert with animal domestication led to infectious diseases, including zoonoses, water stress, and sporadic, even dramatic food shortages. 6
Famine, disease, and parasites were a greater burden to farmers than they were to hunters and foragers, with evidence of reduced stature, lifespan, and higher rates of iron-deficiency anemia among crowded agrarian communities. 7
Pre-agrarian foragers were almost completely free from degenerative disorders like diabetes, with obesity virtually nonexistent. How can this be when such populations had higher animal food intake than do even contemporary Westernized nations that suffer so bitterly from the so-called diseases of civilization? Clearly, a contradiction exists: Today we eat less meat than our hunter-gatherer ancestors but suffer more diet-related disease—why? Quite simply, “recent” domesticated meat is very different in nutritional composition from that experienced by our foraging ancestors. 5
A broad perspective is necessary in considering our recent past, as many factors decouple biological selection from reproductive fitness. Since the taming of fire, 3 major events stand out in this respect. The first is the agricultural revolution, an event possibly triggered by climatic change and a major food crisis. 8 By the end of the Pleistocene ice age, large prey animals declined as humans became more efficient predators, leading to a lifestyle transition from hunter-gatherer to farmer.9–11 This occurred at multiple loci: 11 000 years BP in the Middle East, 9 000 years BP in SE Asia and New Guinea, 5 000 years BP in Sub-Saharan Africa, and 5 000 years BP in the New World.12,13 The implications of this altered diet and the consequent development of the diseases of civilization are, at least in part, the raison d’être for Darwinian/evolutionary medicine, which has led to a growing corpus of research focusing on the evolutionary discordance hypothesis and what this means for the contemporary nutrition crisis. This history can no longer be ignored in meeting longer-term nutritional and health challenges.
The second transition took place more recently and with even greater impact. The Industrial Revolution offered foods never encountered in our evolutionary past and made them available to the masses, including refined sugars, cereals, and vegetable oils. As an example, industrial-scale wheat flour processing led to a loss of micronutrients, particularly folate and vitamin B6. Low folate/B6 levels elevate vasculotoxic homocysteine, which contributed massively to late 20th century cardiovascular disease mortality.14–17 Recent mandatory folate fortification decreased cardiovascular disease and neural tube defects, but may increase risk of other diseases, 18 illustrating that the nature versus nurture equilibrium is a complex balance. The significance of the causal relationship between folate, homocysteine, and cardiovascular disease is underscored by a 1995 article suggesting that mandatory folate fortification could prevent 50 000 US coronary artery disease deaths annually. 19 In 1998, fortification was implemented in the United States.
Postindustrial dairy products and refined foods represent 72.1% of total energy consumed in the United States, but would at most have been a trace contributor to the pre-agricultural hominin diet. 20 The popularity of processed and junk foods, along with deviation from diurnal rhythms in eating patterns, feeds into the third major transition—development of the obesogenic environment—a blending of unnatural dietary practices, thrifty genes, lifestyle choices, and digital technology. This is a recent phenomenon characterized by changing anthropometric indices, particularly greater central adiposity in developed nations. 21 Similarly, unregulated dietary supplements can lead to a failing of nurture over nature.18,22,23 It is important to recognize that the human phenome is being shifted by altered cultural norms, particularly the head-on confrontation between the obesogenic environment and thrifty genes thought to have evolved to survive famine. However, one should also consider other factors decoupling biological selection from reproductive fitness. Smaller families in the developed world, along with a trend toward starting a family at an older age, conspire with the increasing need for in vitro fertilization to challenge natural selection and promote ever greater infertility.24,25 Equally, since the early 1990s, our individual and connective worlds have been increasingly viewed through an expanding virtual interface, creating an extended anthropogenic phenotype that promotes a sedentary lifestyle. Personal computers and handheld mobile devices allow disengagement from the physical world, reduce caloric expenditure, and divorce us from our necessary connection with nature. An important component of the third transition is therefore the very recent move toward an inactive lifestyle dictated by the information age.
Interpreting the Data
As compelling as some nutrient–gene–disease relationships are when considered in an evolutionary and/or lifestyle context, interpretations are often speculative in nature. Therefore, in considering our dietary history through the anthropocene, it is crucial to recognize certain limitations and to ask some hard questions: How reliably can we reconstruct past nutritional environments that were unquestionably heterogeneous in time and space? Under exactly what circumstances can we reliably conclude that nutrient-related genes are mismatched to contemporary diets, and are there any signatures that point to selection? Where signatures do exist, are they consistent between populations according to what we know about historical dietary practices? How do alternative explanations for disease conditions, attributed in this article to a dietary mismatch, stack up; that is, can they be reliably rejected? The complexity and diversity of this topic make such questions, valid as they are, difficult, if not impossible, to address categorically. One only has to consider one line of enquiry indicating that obesity may not be just a modern phenomenon, and which highlights problems of interpretation: Jozsa 26 showed that 51 of 97 female statues from the Upper Paleolithic period reflected obesity, and few (7) were pregnant, suggesting these Venus figurines did not represent female fertility as is often thought; rather, they may have symbolized hopes for survival and longevity in well-nourished and hence reproductively successful populations during the last ice age. 27 In this sense, they may reflect neither fertility nor the overall population anthropometrics. So, how can we draw meaningful interpretations and provide an unbiased argument one way or the other? In many cases like this, we simply cannot.
Even many association studies that chart genetic elements as modifying dietary or other lifestyle effects in respect of disease occurrence can provide only preliminary insight, and specious results are possible, particularly in retrospective assessments or cross-sectional studies.
The need for a tighter hold on cause and effect in understanding the complexity of diet and human disease calls for technological solutions. With advances in metabolomics, proteomics, transcriptomics, and epigenomics under the key umbrella discipline of nutrigenomics, the “omics” revolution should help achieve this by supplementing our existing traditional view of nutrition.
Diet and Disease: Fast Foods, Slow Genes
In addressing the mismatch between “ancient” genes and changing dietary patterns in the context of health, it is worth noting that diets since the advent of agriculture and the Neolithic revolution do little to impair human reproductive fitness. There may be physiologic perturbations at the height of the optimal reproductive phase, but these do little to affect overall species fitness. They do, however, influence senescence and risk for degenerative disorders from cancers and vascular disease to diabetes, osteoporosis, and even dental caries—phenomena largely beyond the process of natural selection. Examples are dealt with later where nutrient availability alters fertility, and it is well established that obesity per se reduces fertility, but the focus here is largely on the dietary changes that have increased morbidity and mortality since man became “civilized.”
Extant (and historical) forager populations have a dietary intake (proportion of calories) of around 75% from animal foods, with extremes of 33% (!Kung) and 99% (Arctic Inuit). 28 The !Kung are considered outliers, not typical of foragers in general. Despite such populations having a generally high intake of calories from animal foods, composition differs from contemporary sources of domesticated animal foods in being lower in saturated and monounsaturated fatty acids and higher in beneficial polyunsaturated fatty acids. Wild game meat is lean and protein dense, providing only around 50% of energy as fat, unlike domesticated grain-fed cattle. Such cattle have been selectively bred to have tastier meat, but at a price to health—excess calories, saturated fats, cholesterol, and an increased ratio of omega-6 to omega-3 polyunsaturated fatty acids all contribute to disease burden. In 2012, a prospective study of 121 342 participants concluded that red meat consumption is associated with increased risk of total, cardiovascular disease, and cancer mortality. The pooled hazard ratio (95% confidence interval) of total mortality for a one serving per day increase was 1.13 (1.07-1.20) for unprocessed red meat and 1.20 (1.15-1.24) for processed red meat. The corresponding hazard ratios were 1.18 (1.13-1.23) and 1.21 (1.13-1.31) for cardiovascular disease mortality and 1.10 (1.06-1.14) and 1.16 (1.09-1.23) for cancer mortality. 29
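To make the quoted hazard ratios concrete, a brief illustrative sketch (not from the cited study) shows how, under a proportional-hazards assumption, a per-serving hazard ratio scales multiplicatively with additional daily servings:

```python
def hazard_ratio_for_servings(hr_per_serving: float, servings: float) -> float:
    """Under proportional hazards, k extra daily servings scale risk by HR**k."""
    return hr_per_serving ** servings

# Per-serving total-mortality hazard ratios reported in the 2012 study
UNPROCESSED_HR = 1.13
PROCESSED_HR = 1.20

# Two extra daily servings of processed red meat imply roughly a 44% higher hazard
print(round(hazard_ratio_for_servings(PROCESSED_HR, 2), 2))  # 1.44
```

This multiplicative reading is only as good as the proportional-hazards assumption and says nothing about residual confounding; it simply restates the published per-serving estimates.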
Add to this that, of the macronutrients, protein (not carbohydrate or fat) has the greatest impact on satiety because of our inability to store protein, and it is clear that energy-dense, low-protein diets do not satiate; they simply add calories. Such diets are typical of contemporary and recent dietary patterns and overload the provision made by evolution, in which gut complexity was lost most likely in response to a higher-quality diet. Such higher-quality ancient diets may have driven encephalization through Homo ergaster, H erectus, and on to H sapiens.
Modern diets additionally differ from ancestral ones by their elevated glycemic load, reduced fiber, increased trans-fatty acids, lower micronutrient content, and an acid–base balance that is, today, typically net acid-producing unlike the Paleolithic diet, which was net base-producing due to high vegetable and fruit intake. 20 These, along with the altered fatty acid and general macronutrient profiles alluded to above, conspire with a raft of relatively new food sources such as dairy products, cereals, refined vegetable oils (including ones with nonbeneficial fatty acid profiles such as soy, corn, palm, and coconut oils), sucrose, and ethanol, to contribute to the degenerative diseases of civilization. While nutritional transition since the Paleolithic is a major factor in increased rates of obesity, type 2 diabetes, cardiovascular disease, cancer, and other disorders, the best definition of the cause of these diet-related disease phenotypes is that our metabolic homeostasis is mismatched to contemporary energetic and nutritional environments. This evolutionary discord is expressed as a range of disease phenotypes with important biological, medical, and socioeconomic implications for the world’s developed nations. Clearly, it is increasingly important to take account of our evolutionary history to meet present and future nutritional challenges. 8
Nutrigenetic Solutions and Problems
Various evolutionary adaptations, through selection of specific genetic variants, allow us to use new food sources (eg, the lactose tolerance gene and dairy culture coevolution 30 —also see Table 1). In contrast, many gene variants conferring survival advantage in natural environments create problems within the framework of contemporary Western diets, for example, “thrifty” genes that conferred ability to survive resource bottlenecks during periods of famine. In fact, we now know that cultural practices, powerful selection pressures, and genetic drift have conspired to rapidly alter around 700 regions within our genome in our recent past (between 5000 and 15 000 years BP).31–34
Selected Examples of Ostensibly Beneficial Nutrient-Related Genes Associated With, or Possibly Associated With, Novel or Changed Food Sources.
Looking at Tables 1 and 2, a hazy line divides nutrient-related genes associated with beneficial outcomes from those linked to adverse phenotypes. Indeed, the same nutrient–gene interaction can have an altered effect depending on stage of lifecycle and/or prevailing environmental conditions (nutrient availability and/or wider related factors). Diet also undoubtedly has a socially learned component that sets it apart from nutrigenetic aspects. Medicine seldom looks at disease from an evolutionary perspective, despite recognition from the highest level that this should be the case and that Darwinian/evolutionary theory should be established as an important component of undergraduate medical programs. 96
This novel discipline allows us to take a more appropriately focused perspective in asking why disease actually exists. While the answer could be discussed at length, the outcome is always going to be relatively simple. Natural selection is slow: this allows a mismatch between our body and novel environments to occur. Natural selection is constrained: any given trait is likely a trade-off. There is also general confusion about human biology: it was widely accepted that organisms are subject to r/K selection to promote success in particular environments, shaping them for a trade-off between quantity and quality of offspring rather than for health, defenses, and freedom from suffering. It is important to recognize that the r/K paradigm has now evolved into more rigorous ideas on the evolution of life histories, discussed elsewhere by Reznick et al. 97
The bottom line is that noncommunicable metabolic disease has not been forged directly by natural selection; rather, selection has shaped our vulnerability to such disease. This means natural selection is as much about maladaptation as adaptation. With these points in mind, and now having significant control over our environment, it is often asked whether human evolution has stalled. This question is highly relevant: In developed countries no energetic limitations on reproduction exist; health care provision makes selection advantage/disadvantage of genotypes far less critical; and populations are no longer isolated, so gene flow is high. Despite this apparent stall, humans still need to respond to infection, and this remains a major target for evolutionary processes.
Thrift
Obesity and Diabetes: The Established View
Recent reexamination of the concept of thrifty genes, phenotypes, and norms implicates these phenomena as metabolic traits underpinning frugality in the expenditure or deposition of energy. 98 However, while these traits emerged during Homo evolution to buffer reproduction against ecological stochasticity, their actions are mediated at different timescales. Thrifty genes are embedded in our distant evolutionary past, while some thrifty phenotypes are set within a single lifecycle or over a very recent number of lifecycles.
Comorbidities of contemporary obesity may be a vestige of variable adiposity selected for via genotypes conferring survival advantage within certain ecological contexts (maladaptive consequences of highly adaptive phenotypes), for example, where food availability is unpredictable such as remote oceanic islands. Type 2 diabetes is a growing global problem, but in some populations the incidence of type 2 diabetes is extraordinarily high. On Nauru, a Micronesian island, around 30% to 40% of islanders older than 15 years have the disease. Historically, islander life was harsh, and ability to rapidly build up fat at times of plenty to survive the resource bottleneck of famine will have been hugely advantageous. Genes conferring this survival trait under ancestral conditions do not respond well to typical Western diets. We know this because Nauru found prosperity through its guano deposits, as a consequence of which islanders now have a sedentary lifestyle and calorie-rich diet completely mismatched to their ancestral genes. This promotes the high incidence of type 2 diabetes, the leading cause of nonaccidental death. Genes related to insulin resistance/type 2 diabetes in Nauruans include the Pro1019Pro leptin receptor and apolipoprotein D Taq1 polymorphisms.99,100
The “thrifty genotype” hypothesis was actually first put forward to explain modern man’s widespread susceptibility to diabetes (type 1 and 2 diabetes had not yet been differentiated). 101 One seldom reads of “unthrifty” genes, but they do exist (see Tables 1 and 2). The Pro12Ala polymorphism in PPARγ described in Table 2 is likely such an example, as it protects against type 2 diabetes in several ethnic groups.68,80,102
Selected Examples of Nutrition-Related Genes Associated With Obesity Phenotypes and/or Disease Risk.
It is interesting to consider changing patterns of adiposity over the lifecycle. High at birth and during infancy, adiposity declines through early childhood only to manifest itself in a sexually dimorphic fashion during the reproductive years. This sexual disparity in adiposity converges in a reductive manner during senescence. This early life phenotypic pattern for adiposity in H sapiens is a consequence of selection pressures arising from the need to tolerate increasingly marginal stochastic environments (ie, those with unpredictable food sources). The large human brain requires high levels of energy, particularly the infant brain, which uses adiposity to buffer perturbations in energy supply. The same is true for mothers, who need to supply all infant energy requirements until weaning begins; this occurs at 6 months in Western populations, but up to 3 years in hunter-gatherer populations, where breast milk likely buffers against unreliable food sources. 103 This is well illustrated by the cerebral share of resting metabolic rate, which in humans drops from 85% to 45% to 20% in the transition from neonate to 5 year old to adult, respectively. 104
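A back-of-envelope sketch can translate those proportions into absolute figures. The resting metabolic rate (RMR) values below are assumed, textbook-style figures chosen purely for illustration; only the 85%/45%/20% fractions come from the text:

```python
# Cited fractions of resting metabolic rate consumed by the brain
BRAIN_FRACTION_OF_RMR = {"neonate": 0.85, "5 year old": 0.45, "adult": 0.20}

# Assumed whole-body RMR in kcal/day (illustrative values, not from the source)
ASSUMED_RMR_KCAL = {"neonate": 400, "5 year old": 900, "adult": 1500}

for stage, fraction in BRAIN_FRACTION_OF_RMR.items():
    brain_kcal = fraction * ASSUMED_RMR_KCAL[stage]
    print(f"{stage}: ~{brain_kcal:.0f} kcal/day to the brain")
```

With these assumed RMRs, absolute cerebral demand peaks in childhood even as the proportional share falls, consistent with the energetic argument for infant and childhood adiposity.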
Bouchard postulated a hierarchical model involving 4 affecters to explain the present epidemic of human obesity. 105 He identifies built environment, social environment, behavior, and biology as key factors. Overall, these promote an obesogenic environment favoring obesogenic behavior. However, genetic variation contributes greatly to an individual’s predisposition to obesity, favoring positive energy balance and weight gain. The same author identified 127 genes associated with obesity. 106 Of these, 22 were supported by at least 5 positive studies and include ACE, ADIPOQ, ADRA2B, ADRA3B, DRD2, GNB3, HTR2C, IL6, INS, LDLR, LEP, LEPR, LIPE, MC4R, NR3C1, PPARγ, RETN, TNFA, UCP1, UCP2, UCP3, and VDR. When these were aligned against biologic traits favoring sustained energy balance, Bouchard identified 5 genotypic classes: not only a thrifty genotype but also hyperphagic, sedentary, low lipid oxidizing, and adipogenic genotypes. These obesity-promoting genotypes are not mutually exclusive, and ultimately obesity phenotypes likely have contributing epigenetic components as well. More recently, genotyping of 20 125 individuals identified 12 SNPs with a significant effect on body mass index, including BCDIN3D, BDNF, ETV5, FTO, GNPDA2, KCTD15, MC4R, MTCH2, NEGR1, SEC16B, SH2B1, and TMEM18. 107 Of these, FTO and TMEM18 stood out for increased obesity risk.
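Association studies of this kind typically model risk alleles additively. The sketch below illustrates the structure of such an additive SNP score; the per-allele effect sizes are invented for illustration and are not the estimates from the cited genotyping study:

```python
# Hypothetical per-risk-allele effects on BMI (kg/m^2); invented values
PER_ALLELE_EFFECT = {"FTO": 0.39, "TMEM18": 0.31, "MC4R": 0.23}

def additive_bmi_score(genotype: dict) -> float:
    """Sum per-allele effects; genotype maps SNP name -> risk-allele count (0-2)."""
    return sum(PER_ALLELE_EFFECT[snp] * count for snp, count in genotype.items())

# An individual homozygous for the FTO risk allele, heterozygous at TMEM18
print(round(additive_bmi_score({"FTO": 2, "TMEM18": 1}), 2))  # 1.09
```

Real genome-wide scores weight many more loci using regression-estimated effect sizes, but the additive structure is the same.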
Contrasting with thrifty genes that emerge over relatively long timeframes, thrifty phenotypes represent short-term adaptation to shifting ecological context. In this sense, the early embryo is influenced by the nutritional environment in a way that can affect gestational survival. Nutrient levels mediate this via changes to the developing physiological system. This adaptive plasticity tunes the embryo/fetus to cope with in utero nutritional stress, but subsequently leads to adverse metabolic consequences during adulthood in the face of a lifestyle (ecological context) with nutritional excess. 108 Such adaptive plasticity continues into infancy. 109 This should also be considered in light of calorie-restricted diets (in adults) decreasing mortality across many organisms from nematodes to mammals110,111: while nutritional cues such as undernutrition during early life can have adverse effects later in life, eating less later in life can actually extend the lifespan. The influence of nutrition in utero and early in the lifecycle is the area of “Developmental Origins” and is discussed below as a major subsection.
Another concept to consider is “bet hedging.” This represents favoring genetic variability because it distributes risk across offspring. It is suggested that this phenomenon is distributed across several genes encoding proteins involved in behavior and cell metabolism, and that these favor heritable adiposity traits even in the absence of highly selective pressures that favor thrift. 98 Genetic variability as a consequence of bet hedging fits a model in which high levels of developmental plasticity and constrained genetic adaptation may be indicative of our colonizing ability; that is, adiposity as an adaptation underscored survival in new, nutritionally unstable environments. As a concept, bet hedging is most usually applied to the distribution of reproductive effort across reproductive events, but the application as presented here 98 is both interesting and valid.
Given that thrift is an increasingly debated concept within evolutionary paradigms describing human adiposity, and is particularly evident in neonates, during infancy, and in reproducing females, the best conclusion is probably that thrift is a consequence of our energetically expensive brain and the resource-variable ecosystems we occupy. However, as stated by Wells, 98 contemporary variation in adiposity reflects genetic, life history, and behavioral variables. Expression of thrifty traits emerges via complementary pathways, but in a way that shows heterogeneity of the obesity phenotype between individuals. Wells takes this to imply that a range of different treatments is likely to be required in addressing the condition. 98 The thrifty gene hypothesis is both strongly supported and sometimes contested, but it certainly cannot be ignored in the context of nutrition, obesity, and many contemporary comorbid diseases.
New Views on the Acquisition of Thrift
An important view is that thrifty genotypes are not solely related to obesity and attendant comorbidities in today’s largely obesogenic environment. The definition of thrift is “economy and frugality.” Extending this definition into a biologic realm allows it to transcend traits most often associated with positive energy balance and weight gain. An example of a thrifty gene unrelated to obesity is MTHFR. The 677T-MTHFR allele enhances fidelity of DNA (dTMP) synthesis when dietary/cellular folate is depleted. 112 This arises because the polymorphic enzyme creates a bottleneck in the biosynthesis of methionine, redirecting 1-carbon units to DNA production. Thus, by allowing DNA synthesis to proceed with lower folate, it could be argued that MTHFR is a thrifty gene. As with obesity-related thrifty genes, there are negative correlates to be aware of. The 677T-MTHFR allele offers enhanced DNA fidelity when folate is low, but also raises homocysteine, enhancing risk for cardiovascular disease later in life. In this sense, it offers advantage early in the lifecycle, but disadvantage later. Where selection favors genes conferring short-term benefit to the organism at the cost of enhanced deterioration later, during the senescence phase, this is referred to as antagonistic pleiotropy. 113
From a Darwinian/evolutionary medicine perspective, MTHFR provides a good example of how a gene that increases homocysteine when dietary intake of folate is poor, as occurred through much of the latter half of the 20th century due to overprocessed wheat grain, led to increased rates of cardiovascular disease and possibly neural tube defects. Subsequent targeted pharmacologic and population measures to increase folate removed the negative modulatory role of MTHFR on these and related clinical phenotypes.
Perspective is important; foraging/food/nutrition is an important thread within the construct of Darwinian theory. It is interesting to rationalize Alzheimer’s disease in the context of nutrient acquisition and thrifty genes. Alzheimer’s disease is the fifth leading cause of death in those older than 65 in the United States, with an economic burden making it the third costliest illness after heart disease and cancer. 114 Converging evidence supports a paradigm in which cognitive decline and Alzheimer’s disease develop via slow, insidious changes to brain tissue over the course of the human lifespan, reflecting an adaptive “metabolism reduction” program. 115 This clinically and economically relevant phenotype in today’s world is possibly an outcome of the decreased energy requirement of our hunter-gatherer ancestors once they reached their 30s, 40s, and 50s. In ancient populations, as in extant ones today, foraging ability declines dramatically more than a decade before the average age of death. Reser posits that because of this, the huge metabolic liability of the brain must have been “tempered” by the typical early molecular changes characteristic of Alzheimer’s disease, which accumulate in all humans from early adulthood. 115 In primal environments, ancestral mortality ensued before these cerebral “metabolism reduction” cellular processes could precipitate clinical Alzheimer’s disease. In other words, selection pressures did not (indeed could not) operate to prevent these mid-lifecycle adaptive changes from progressing to the maladaptive Alzheimer’s disease phenotype. This fits with observations that during childhood and into adulthood, brain metabolic rate declines linearly with age, reflecting the fact that children have more new information to learn, and that Alzheimer’s disease is simply a continuation of this trend in brain metabolic decline.
While the evolutionary/foraging perspective on Alzheimer’s disease is interesting, mechanisms remain unclear. Although thrifty genes such as the APOE4 allele offer early life advantages in foraging where food sources are unreliable,83,116 they also contribute to increased risk for Alzheimer’s disease later in life—a further example of antagonistic pleiotropy, 117 paying the piper later in life to offset earlier benefits. Of course, the early life advantages of APOE4 benefitted ancestral humans, while the increased risk of Alzheimer’s disease and other degenerative diseases was unlikely to be relevant to them, but is relevant in today’s obesogenic world. Apolipoprotein E transports lipoproteins, cholesterol, and lipid-soluble vitamins into lymph and blood. Allelic variants of the APOE gene alter physiologic function, with the E4 variant being less effective at degrading amyloid in Alzheimer’s disease; the allele geographically tracks other major thrifty genes involving insulin resistance and adiposity, but also augments steroid synthesis, improving fertility over the E2 variant. 83 In the case of Alzheimer’s disease, diets need to be tailored to accommodate contemporary understanding of nutritional risk factors, for example, in the context of dietary lipids, cholesterol, and blood homocysteine.
Developmental Origins of Health and Disease
Developmental origins research deals with the association between nutritional conditions in utero and later health outcomes. Ellison has pointed out that applying insights from evolutionary biology to developmental origins highlights the value of the new field of evolutionary medicine. 118
Maternal undernutrition during pregnancy and/or impaired nutrient transfer to the embryo/fetus leads to metabolic adaptations enhancing immediate survival, but because these changes become fixed, they can alter liver and muscle structure and hormone receptor density. Hales and Barker119,120 suggest that such changes remain advantageous in the long term if nutrition is frugal, 121 but become deleterious in an obesogenic environment.119,120,122–125
Research now indicates many chronic adult diseases have a long latent period originating from early life events involving developmental plasticity as an adaptive survival mechanism. Such developmental plasticity is underpinned by epigenetic phenomena such as DNA methylation and histone modifications, which control gene expression by remodeling chromatin. In this way, both genome and epigenome conspire to shape phenotype. Even monozygotic twins with the same DNA diverge from birth as a consequence of epigenetic modifications due to factors such as nutrient, xenobiotic, toxin, and ultraviolet exposure. Epidemiologic evidence shows that small size or relative thinness at birth and through infancy correlates with cardiovascular disease, stroke, type 2 diabetes, obesity, metabolic syndrome, and osteoporosis in adulthood. 108 An example is the association between adult lumbar-spine bone mineral density and vitamin D receptor genotype when analyzed by birthweight. 126 This implies that adult bone mineral might be modified by undernutrition in utero, 108 a notion further supported by a large twin study showing concordance between birth weight and bone mass, implicating the intrauterine environment as well as heredity. 127 Epigenetic mechanisms are thought to be important in mediating the effects of maternal lifestyle, body build, and vitamin D receptor genotype on offspring bone mass, as reviewed by Goodfellow et al. 128 Excess energy supply early in the lifecycle also has negative correlates later in life. Maternal hyperglycemia leads to fetal hyperinsulinemia and increased fat deposition. Studies suggest offspring of diabetic or obese women are at increased risk of metabolic complications even early in the lifespan.129,130 Clearly, as with many aspects of nutritional health, the relationship between fetal nutrition (and hence birth weight) and subsequent metabolic complications later in life is U shaped.
Too little or too much nutrition has a negative impact. The same seems true for infants with higher energy intake due to formula feeding compared with breast feeding; such infants exhibit higher rates of obesity as adults. 131
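The U-shaped relationship described above can be sketched as a toy quadratic risk model. This is purely illustrative: the function, optimum, and curvature parameter are hypothetical assumptions, not values from the cited studies.

```python
# Toy illustration of a U-shaped risk curve: relative risk of later
# metabolic complications as a function of birth weight.
# All parameters (optimum, curvature) are hypothetical.

def relative_risk(birth_weight_kg, optimum_kg=3.5, curvature=0.4):
    """Quadratic (U-shaped) risk model: risk is lowest at the optimum
    and rises toward both the undernourished and overnourished extremes."""
    return 1.0 + curvature * (birth_weight_kg - optimum_kg) ** 2

# Both tails of the birth-weight distribution carry excess risk
# relative to the optimum.
low = relative_risk(2.0)    # small/thin at birth
mid = relative_risk(3.5)    # near the optimum
high = relative_risk(5.0)   # macrosomia (e.g. maternal hyperglycemia)
```

The sketch simply encodes the qualitative claim that both deficits and excesses of fetal nutrition raise later risk, with a minimum between them.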
By understanding the various candidate in utero mechanisms underlying the developmental origins of adult disease, such as indelible change to organ structure, programmed changes to gene expression via the epigenome, and altered cellular ageing, intervention strategies during pregnancy that deflect future disease may become possible. 132 Such interventions are limited in scope by our knowledge base and safety considerations, but several compounds have been examined, including folate, taurine, and antioxidants.133–135
Ageing and Lifecycle Complexity
While there are many theories of ageing, the developmental origin of adult disease provides an elegant link between diet in pregnancy and infancy and subsequent health outcomes that influence mortality. Other theories linking nutrition with age-related health correlates such as longevity include antagonistic pleiotropy, 117 described above. This theory supports a “pay later” model in which a gene offers advantage early in life but detriment later. Another attractive rendering of this model is the disposable soma theory of Kirkwood and Holliday. 136 It predicts that ageing is driven by cellular and DNA damage accrued over the lifespan and that, faced with finite resources, the more an organism expends on maintaining somatic homeostasis, the less it can expend on reproductive processes (and vice versa). In other words, the focus is on maintaining the germ-line across generations but sustaining the soma only within a single generation, such that an appropriate balance is struck between the reproductive imperative and somatic maintenance and repair. Failure in the latter due to evolved gene limitations results in disease and ageing. Nutrition is likely important within this paradigm, as calorie restriction may support somatic maintenance and slow ageing via coordinate upregulation of the stress response system. 137
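The disposable soma trade-off can be caricatured in a minimal resource-allocation sketch, assuming a fixed energy budget split between somatic maintenance and reproduction; every quantity below is a hypothetical placeholder, not a parameter from the theory’s literature.

```python
# Minimal sketch of the disposable soma trade-off: a fixed energy budget
# is divided between somatic maintenance and reproduction, and unrepaired
# damage accumulates year by year. All parameters are hypothetical.

def simulate_damage(maintenance_fraction, years=50, damage_rate=1.0):
    """Accumulate somatic damage over a lifespan; the fraction of the
    budget spent on maintenance repairs that share of each year's damage."""
    damage = 0.0
    for _ in range(years):
        damage += damage_rate * (1.0 - maintenance_fraction)
    return damage

# More investment in maintenance leaves less lifetime damage, but (in the
# theory) at the cost of the budget available for reproduction.
high_repair = simulate_damage(0.8)   # 80% of budget to maintenance
low_repair = simulate_damage(0.2)    # 20% of budget to maintenance
```

The point of the sketch is only the direction of the trade-off: somatic maintenance and reproductive allocation draw on the same finite budget, so damage accrues fastest when maintenance is shortchanged.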
Genes involved in central aspects of metabolic regulation, especially energy balance, are crucial to ageing. Many have been discussed earlier, but insulin signaling appears to be particularly important. Recent work has examined age-dependent gene expression across tissues in humans and mice. Although the motif most strongly predicted to regulate ageing seems to be nuclear transcription factor kappa-B,138–140 there are still many direct modulatory effects of nutrients on the lifecycle that, although latent, influence disease and ageing: it has been shown that deficiency in any of several micronutrients, including folate, vitamins B12, B6, B3, C, and E, iron, and zinc, mimics the damaging effect of radiation on DNA, causing single- and double-strand breaks, oxidative lesions, or both, 141 while vitamins B12, B6, and folate are also critical for maintaining the methylome.142,143 Additionally, higher vitamin D concentration is associated with longer telomere length, underscoring potential benefits of this hormone on ageing and age-related diseases. 144 Vitamin B3, as NAD, is required for the ADP-ribosylation of proteins and poly-ADP-ribosylation of nucleoproteins needed for DNA repair in response to strand breakage. Energy-dense, nutrient-poor diets would undoubtedly drive this kind of “nutrient–ageing” equilibrium in the wrong direction. While relevant to Darwinian medicine, this generally latent effect of micronutrients on genomic and methylomic integrity might extend beyond an individual’s lifespan and conventional wisdom on epigenetic programming to the very earliest phase of the human lifecycle: conception.
Folate is an important determinant of phenotype throughout the lifecycle; recent reports indicate that periconceptional ultraviolet exposure (photoperiod and quanta) during the first trimester of pregnancy predicts folate-sensitive, epigenome-related neonatal genotypes, suggesting humans may be tethered to diet in ever more interesting ways, reflecting a signature of the seasonal rhythms that were once important to human survival.145–150
While micronutrients have an important place in the construct of Darwinian theory, so do hormones. A recent study provides compelling evidence that male sex hormones decrease longevity. 151 It is therefore interesting that clinical phenotype may be influenced by physiological modulators of steroid hormone action such as vitamin B6, which regulates transcriptional activation by multiple classes of steroid hormone receptors. 152 In this sense, B6 status may be important in the pathoetiology of hormone-sensitive tumors such as breast cancer: low B6 might diminish suppression of nuclear response element activation. As alluded to earlier, industrial-scale processing of flour effectively stripped both folate and B6 from this staple; along with elevating homocysteine, a factor in cardiovascular disease, this loss may also conceivably have influenced cancers of reproductive tissues.
Bridging the Gap Between Ancestral and Contemporary Diets
The chasm between ancestral and contemporary environments is now being bridged to a certain extent by improved nutritional education/public awareness and government-mandated fortification programs. These have improved intake of crucial micronutrients such as folate, B12, B6, B3, and others, but some nutrients remain a concern, particularly vitamin D. Of 15 micronutrients examined by food frequency questionnaire in an elderly Australian population (n = 229), 153 only calcium, vitamin A, and vitamin D showed total dietary intakes below 100% of the recommended dietary allowance, at 89%, 73%, and 24%, respectively (see Table 3).
Average Micronutrient Intake for an Elderly Australian Retirement Village Population (n = 229) Obtained Using a Standard Food Frequency Questionnaire as Previously Reported. 153
Abbreviation: RDA, recommended dietary allowance.
a Values show both total dietary intake and intake from supplements expressed as a percentage of the RDA (figures based on RDA for a 70-year-old female). Supplemental intakes typically bridge any gap between food intake and requirement very well, except for vitamin D, which appears to be a problem micronutrient with total intake being only 24% of the RDA.
Clearly, Table 3 also shows supplemental intakes are a significant component of daily values, particularly for vitamins B1, B2, B6, and B12. This addresses the long-standing concern over low B12 and B6 intakes in the elderly, and is especially important for vitamin B12, where low intakes can contribute to the development of pernicious anemia. The major problem among the nutrients examined here is vitamin D, for which average total intake achieves only 24% of the recommended dietary allowance. This is additionally problematic in the elderly because the skin’s ability to synthesize calciol (cholecalciferol) from 7-dehydrocholesterol declines with age. Since the main function of vitamin D is control of calcium homeostasis, a reduction in calcitriol may modulate healthy ageing. 23
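The percentage-of-RDA figures discussed above amount to a simple calculation: total intake from food plus supplements divided by the RDA. The sketch below illustrates this arithmetic; the intake and RDA numbers in the example are hypothetical placeholders, not data from the cited study.

```python
# Sketch of the %RDA calculation behind Table 3: total daily intake
# (food plus supplements) expressed as a percentage of the recommended
# dietary allowance. Example figures below are hypothetical.

def percent_rda(food_intake, supplement_intake, rda):
    """Total daily intake as a percentage of the RDA (all in the same unit)."""
    return 100.0 * (food_intake + supplement_intake) / rda

# e.g. vitamin D with hypothetical values in micrograms per day:
vit_d_pct = percent_rda(food_intake=3.0, supplement_intake=0.6, rda=15.0)
# a low percentage flags a problem micronutrient such as vitamin D
```

Supplemental intake simply adds to the numerator, which is why supplements can bridge the food–requirement gap for most micronutrients while vitamin D, with little of either source, stays far below 100%.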
Therefore, while the fiber, micronutrient, and polyphenolic antioxidant potential of our ancient Paleolithic diet was up to 3 times that of our contemporary Western diet,154,155 we are redressing deficiencies introduced by industrial-scale food processing, although a discordance still exists with respect to overall nutritional, cultural, and activity patterns. 20
One of the more recent developments in the field of nutrition is the concept of functional foods, that is, food components that afford demonstrable physiological benefits or that can minimize the risk of chronic disease, above and beyond their basic nutritional functions. The principle of foods for healthy living extends back to Hippocrates, who famously articulated his “food as medicine” philosophy 2500 years ago. By the industrial revolution, his sagely words had fallen into relative obscurity. While the first half of the 20th century was a time of great discovery, with the essential micronutrients being uncovered and deficiency syndromes addressed, it was not until the 1970s that Western countries underwent a paradigm shift in the relationship between diet and health, with emphasis moving from undernutrition to overnutrition and disease causation. With an increasingly ageing, health-conscious population, coupled with advances in technology, medicine, and nutritional science, the 1990s saw the development of functional foods and nutraceuticals as a way to improve health, achieved through merged thinking and collaboration between commercial and academic sectors. While in principle this new functional food discipline builds on the health benefits discovered by early 20th century nutritional scientists like Casimir Funk and Sir Frederick Gowland Hopkins, in today’s media-rich world it is unfortunately open to abuse via the excesses of advertising hyperbole. In particular, it is a mistake to believe that more is always better. Often there is an optimal intake level, with too little or too much being unhealthy. The best example is the antioxidant provitamin β-carotene, which is associated with reduced cancer rates at normal levels of intake, but when used at supraphysiologic levels as a nutritional supplement to prevent cancer, it has the opposite effect. At these high levels it acts as a pro-oxidant, and therefore a free radical generator.
In this respect, similarities between β-carotene and folate have been drawn. 18 At moderate levels of intake, folate can prevent birth defects, but at elevated levels it may have adverse effects. These have been reviewed elsewhere 18 but include counterintuitive, negative effects on vascular disease occurrence and cancer; in the face of low vitamin B12 status, high folate intake is also associated with increased cognitive decline in the elderly.
The interplay of nutrients and genes with other lifestyle factors has unpredictable complexity and makes gauging population measures an unenviable task. Nevertheless, the enrichment of flour with niacin under a US government program to correct nutrient deficiency was probably the first modern attempt to fabricate a food for a functional purpose related to a nutritional outcome: the eradication of pellagra. A great many countries are now doing the same for neural tube defect prevention through mandatory fortification of grain with folate.
Clearly, mandatory fortification is a broad, one-size-fits-all approach to contemporary public health medicine, but it is an effective way to bridge nutrient deficiencies. However, nutritional interventions may eventually become individualized through the emerging field of “personalized medicine.” As genomic technologies become cheaper and more widely available, this approach may become a viable way to tailor nutritional advice to the individual’s genetic profile and is likely to be one of the safest and most effective modes of applying dietary advice in the future.
Conclusion
The concept of Darwinian/evolutionary theory goes beyond nutritional considerations, but a significant component within this concept does relate to nutrition, in particular the mismatch between our genes, modern diet, and obesogenic lifestyle (see Figure 1). Knowledge of the evolutionary connection between food and disease should be assimilated into undergraduate medical training and gain greater focus within the biomedical research community, and within the discipline of preventative health care in particular. 96 This fits into a broader framework in which scientists are converging on the idea that gene–culture interactions have been critical in human evolution, with population genetics modeling suggesting that hundreds of genes have been subject to positive selection in response to human activities. 156 The obvious focus of this review has been on foods, with many genes altering the metabolism of carbohydrates, starch, proteins, and lipids, as well as ethanol, under cultural pressure. However, some of the earliest genetic effects related to diet must have occurred with the domestication of plants and the need to detoxify plant secondary metabolites. A wider implication of our nutritional past, not dealt with in great depth here but which cannot be ignored, is exposure to “crowd diseases” as a consequence of agriculture-related population aggregations. The discovery of fire, and hence the invention of cooking, is where this review began; it is also a good place to end, by drawing attention to how smaller masticatory muscles in both modern and fossil members of Homo may stem from a mutation in the gene encoding the predominant masticatory myosin heavy chain (MYH16). This occurrence reflects the gracilization of the modern human skeleton and parallels our accelerated encephalization.
Dated to approximately 2.4 million years BP, the MYH16 mutation represents the first proteomic distinction between humans and chimpanzees correlated to a traceable anatomic imprint in the fossil record 157 and must represent a biological event/process concurrent with the discovery of fire, and hence the advent of the anthropocene.
Where does the future lie? This is difficult to judge, but there is a clear need for more very large, prospective trials examining detailed lifestyle and dietary factors, with the power to delineate the complexity of gene–environment interactions. Increasingly, such studies will need to draw on the fields of metabolomics, proteomics, transcriptomics, and epigenomics, thereby strengthening the overarching discipline of nutrigenomics. This does not diminish the importance of more traditional nutritional science in this area. Konner and Eaton recently reviewed the concept of Paleolithic nutrition 158 as an update on the quarter century since they first published their evolutionary discordance hypothesis. 159 In their update, they underscore their original hypothesis that food is of key importance and that the departure from ancestral diets plays a major role in many modern diseases. The validity of this model remains to be proven, but the tide seems to be moving inexorably toward validation of their seminal paradigm. Nutritional anthropology is clearly an important discipline in which the distant dietary past is an important determinant of our immediate future health.

A few of the human genes that have responded to recent diet-related selection pressures, and some of the consequences of a contemporary diet–gene mismatch supporting the need for a Darwinian medicine perspective to understanding disease.
Footnotes
Acknowledgement
The authors wish to acknowledge the many research scientists at the University of Newcastle’s Human Molecular Nutrition Laboratory who have worked in the area of nutritional genetics and human health over the past 10 years, and who in varying ways have contributed to an improved understanding of this field.
Author Contributions
Declaration of Conflicting Interests
The authors declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding
The authors received no financial support for the research, authorship, and/or publication of this article.
Ethical Approval
Informed consent was obtained prior to volunteers participating in the study with the approval of the local human research ethics committee.
