Alzheimer’s disease (AD) is characterized by impairment of hippocampal episodic memory performance followed by a progressive decline of cognitive and social capabilities. Since AD is the major cause of cognitive decline and no curative drug has been developed, research worldwide is intense and highly competitive. Epidemiological, biochemical, molecular, genetic and animal studies provide different entry points into the complex disease process, which have led to different theories about the aetiology of AD. Besides age as the presumed main cause and primary risk factor, AD has been explained by the oligomeric amyloid beta (Aβ) cascade hypothesis, which includes hyperphosphorylated and dysregulated tau [1], the intoxication hypothesis [2–4], chronic infections [5–7], microbiome composition [8], neuronal insulin resistance [9, 10], physical and functional breakdown of the blood–brain barrier (BBB) [11, 12], chronic neuroinflammation due to multiple causes [13], impaired neuronal rejuvenation (NRJ) [14], synaptic failure [15], and a growing list of many others.

All of these theories are more or less deeply embedded in the belief that aging per se is the main aetiological cause. In fact, the notion that aging per se is the primary cause of AD is so deeply engrained in our thinking that it appears as an almost compulsory statement in the introduction of nearly every scientific paper about AD, and is rarely challenged. But as I will argue, not only are there a number of serious arguments challenging the “age-is-the-primary-cause” dogma; aging as the overarching cause also hinders the development of a “unified theory of AD” (UTAD) that incorporates all key findings, including the long list of well-known environmental and behavioural risk factors, and thereby explains the aetiology and pathogenesis of this debilitating disease. In fact, the lack of a UTAD continues to limit the development of effective preventive measures and curative treatments to trial and error. Therapeutic interventions that focus on such singled-out mechanisms continue to fail [16]. In addition, prevention trials that base their regimens on the correction of more or less arbitrarily selected risk factors, rather than on a complete theory of AD, have so far also been limited in their overall success [17, 18]. In contrast, the proposed UTAD overcomes our concept of age per se as the major cause of AD and provides an encompassing explanation of the aetiology and pathogenesis of Alzheimer’s. It also allows a number of individual, life-changing interventions to be proposed that could prevent AD with high probability. In addition, the UTAD might provide the logical framework for a curative regimen, as will be outlined at the end of this review.

I would like to point out that I have termed the proposed theory “UTAD” because it presents a systemic neurobiological framework of how all currently known major behavioural, environmental or genetic risk factors, individually or in combination, initiate or accelerate the AD process. Certain caveats, however, apply to this theory as to most if not all others: although I tried to be as comprehensive and exhaustive as possible in my search of the PubMed database of the National Center for Biotechnology Information, using key words such as “AD risk factor”, “factors inhibiting AHN” or “neuroinflammation”, some minor risk factors might have been overlooked; it remains to be seen whether they will verify or falsify the UTAD. The same goes for risk factors that are already acting today but have not yet been identified, or that emanate from future individual lifestyle choices or cultural developments. Conversely, in some categories of risk factors (e.g. environmental toxins and chemicals), I purposely listed only examples, since a comprehensive list (e.g. of all currently known chemicals that negatively influence critical mechanisms of the AD process as proposed by the UTAD) would not add to its understanding and would therefore go beyond the scope of this review. It is my hope that once the principal concept of the UTAD is accepted, all risk factors that interfere with the neurobiological mechanisms which, according to the UTAD, are at the centre of AD can be recognized and investigated more efficiently. Last but not least, behavioural risk factors in the context of the UTAD might lead to discussions about free will or freedom of action, which would also go beyond the scope of this review.

Aging is required but not causal for AD

Measurement of insulin resistance of the hippocampus/temporal lobe by positron emission tomography with 2-deoxy-2-[fluorine-18]fluoro-D-glucose integrated with computed tomography (FDG-PET/CT) has become, besides amyloid-PET diagnostics [19], a highly specific and sensitive biomarker for AD, with predictive value even decades before the first clinical symptoms of AD manifest themselves [20]. One may conclude that the development of AD obviously requires time and that, therefore, the risk of developing AD will increase with age. But correlation does not a priori equal causation. In this case, for a disease requiring time to develop, age might simply be a precondition but not necessarily a cause. If age were indeed the cause of the disease, AD would not only be a natural outcome of human aging; the fight against AD would also be a fight against human nature, and thus highly difficult to win.

But, fortunately, many lines of evidence disagree with this explanation (for example see [21, 22]). Particularly from the point of view of human life history: if age per se were indeed the main causative risk factor, why was AD essentially unknown around the beginning of the last century? According to a recent estimate, age would have caused approximately 36 thousand new cases per year in the USA alone, making the disease very common [3]. But a textbook on neurology published in the late 19th century did not even mention an AD-like pathology [23], and when Alois Alzheimer’s first report about the pathology appeared in 1906, he described it as a peculiar brain disease [24], suggesting that it had been unknown before. And even in 1938, 32 years later, Alzheimer’s description of amyloid plaques and neurofibrillary tangles as hallmarks of AD brain pathology had not found its way into a comprehensive textbook of pathology [25]. It may be proposed that, at that time, death as a consequence of AD was a very rare event.

AD prevalence was perhaps similarly rare in Japan in the middle of the last century. A recent study provided significant evidence that not age but rather certain lifestyle factors explain the sevenfold (!) increase in AD prevalence over the second half of the 20th century [26]. This dramatic increase was strongly associated with a change from the traditional Japanese diet/lifestyle towards a Western one, which mainly took place between 1961 and 1985. In particular, besides a large increase in alcohol intake, the consumption of meat and animal products rose seven- and fourfold, respectively. Animal products and meat are known to increase the risk of AD because they contain compounds such as excess iron (which particularly enhances the risk for ApoE4 carriers, as will be detailed below), advanced glycation end products (AGEs) and arachidonic acid, which have been shown to increase oxidative stress and inflammation in the brain (as will also be detailed below). The dietary changes paralleled the dramatic increase in AD prevalence among people aged 65+ years in Japan, which rose from a low 1 % in 1985 to about 7 % in 2008, with a lag time of about 20 years. This trend in disease prevalence can be explained neither by a change in life expectancy nor by genetic drift. Such relatively short intervals are rather typical of other behavioural diseases, e.g. lung cancer from smoking [27].
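The fold increase quoted above follows directly from the prevalence figures; a trivial sanity check (values transcribed from the study cited above) makes the arithmetic explicit:

```python
# Sanity check of the Japanese AD prevalence figures cited above ([26]).
prevalence_1985 = 1.0  # % of the population aged 65+ in 1985
prevalence_2008 = 7.0  # % of the population aged 65+ in 2008

fold_increase = prevalence_2008 / prevalence_1985
print(f"Fold increase in AD prevalence (1985 -> 2008): {fold_increase:.0f}x")
# Note: a ~20-year lag separates the main dietary transition (1961-1985)
# from the subsequent rise in prevalence (1985-2008).
```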

According to the author of this groundbreaking study, AD prevalence rates in Japan may have reached a peak, as in other highly industrialized societies, and will not increase further, since these AD-causing behavioural factors have changed only modestly since 1985. Consequently, one reviewer of the study warns, “unless Japanese people return to the traditional Japanese diet, AD rates in Japan are unlikely to decrease” [28]. A similar increase in AD incidence from low to high rates can be observed in emerging economies. For instance, the age-specific AD incidence rate in rural India was shown to be about four times lower than in the USA [29], and its rise parallels the rate of economic growth, which strongly influences lifestyle [30]. Similarly, the prevalence of AD among African-American populations living in the US is several-fold higher when compared to age-matched Africans in their homelands [31, 32]. Japanese who migrated to the US and adopted the American way of life increased their AD risk [33]. Hence, not ethnic origin, but rather the adopted modern lifestyle has an impact on AD risk [34]. In other words, our individual life history might play a decisive role in shaping the AD risk.

According to a reanalysis of the Framingham Heart Study, the incidence of dementia almost halved over three decades [35]: the 5-year age- and sex-adjusted cumulative hazard rates for dementia were 3.6 per 100 persons during the first epoch (late 1970s and early 1980s), 2.8 per 100 persons during the second epoch (late 1980s and early 1990s), 2.2 per 100 persons during the third epoch (late 1990s and early 2000s), and 2.0 per 100 persons during the fourth epoch (late 2000s and early 2010s). Compared to the incidence during the first epoch, the rate declined by 22, 38, and 44 % during the second, third, and fourth epochs, respectively. Again, these data are inconsistent with age per se being the major cause of AD. According to the authors of the study, the key contributing factors to this decline have not been identified, but it is important to note that this positive trend was observed only (!) among persons with at least a high school diploma. For those with lower education, the AD risk actually rose by 66 % over the four epochs. This indicates that education and/or income, hence socioeconomic factors, might play a pivotal role. But we can certainly rule out age, as low socioeconomic status reduces, rather than increases, lifespan [36]. According to the authors of another study [37], which reported a similar decline, the increased use of antihypertensive, lipid-lowering and antidiabetic drugs might have contributed to these positive trends, as might the experience of fewer stressful life events such as wartime, which particularly afflicted the older generations in the former studies from which the earliest estimates were derived. Taken together, life choices and life experiences appear to be of aetiological significance for AD.
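The reported percentage declines can be approximately reproduced from the raw 5-year cumulative hazard rates quoted above. A minimal sketch follows; note that the published figures derive from adjusted hazard ratios, so crude arithmetic on the rates may differ from them by about a percentage point:

```python
# 5-year age- and sex-adjusted cumulative hazard rates for dementia,
# per 100 persons and per epoch (Framingham Heart Study reanalysis [35]).
rates = {"epoch 1": 3.6, "epoch 2": 2.8, "epoch 3": 2.2, "epoch 4": 2.0}

baseline = rates["epoch 1"]
# Crude percentage decline of each later epoch relative to epoch 1.
declines = {
    epoch: round((baseline - rate) / baseline * 100)
    for epoch, rate in rates.items()
    if epoch != "epoch 1"
}
print(declines)  # crude declines vs. epoch 1, in percent
```

The crude values come out as 22, 39 and 44 %, close to the 22, 38 and 44 % reported from the adjusted analysis.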

Results from research based on animal models also argue against aging being causative for AD. Laboratory animals are kept under so-called standard housing conditions, which represent their “lifestyle” and which are in some respects quite similar to life in Western societies. This “Western-type lifestyle” is sedentary and lacks social activities, and, comparably, the animals suffer chronic sleep deprivation since they are constrained to an unnatural but standard 12-h dark/light rhythm. Furthermore, an unnatural ad libitum feeding pattern is also applied routinely to animals in experimental research, leading to a misinterpretation of experimental results, particularly in AD research. Not the animal living in its wild habitat, where e.g. physical activity and intermittent fasting (IMF) are natural, but the sedentary caged animal, fed ad libitum, became the standard against which we tend to correlate and interpret all data down to the cellular and molecular level. In contrast, a more natural form of housing, termed environmental enrichment (EE), which includes social activity in larger enclosures and environmental complexity (e.g. the presence of objects that can be manipulated, structures for climbing or exercise, foraging opportunities, hiding or nesting areas), is regarded as the experimental condition [38]. But it should be vice versa, since the conditions under EE mimic, in important aspects, those of our own pre-modern lifestyle, hence the general conditions to which our genetic program is adapted. The standard housing condition should therefore be regarded as the experimental condition, in which important factors for physical and mental well-being are eliminated and the consequences for aging and AD are studied. This change of perspective would help us to regard AD primarily as a deficiency disease, i.e. the result of discrepancies between the behavioural requirements of an organism and its actual lifestyle conditions.

In line with the proposed UTAD, even old mice maintained in EE show a robust fivefold increase in adult hippocampal neurogenesis (AHN) [39], which is an important aspect in regard to AD, as will be outlined in detail below. Furthermore, they exhibit reduced anxiety and depression-like behaviours [40], and show improved memory performance compared to animals maintained under standard housing conditions [41]. Therefore, once we start looking at animal models of aging or AD from an evolutionary perspective, experimental deficiencies (under current standard housing conditions) in physical exercise, IMF, social activities or essential nutrients would become obvious causes of AD (see below). For example, one year of IMF prevented cognitive decline even in a triple-transgenic mouse model of AD (expressing the APPswe, PS1M146V, and tauP301L mutations [42]), providing evidence that not aging but rather the feeding pattern has long-lasting effects on memory [43]. More examples will follow, which in combination have one important consequence: most results obtained from animal research (as well as from epidemiological studies) regarding aging and AD need to be reinterpreted from an evolutionary point of view: “normal” aging under “Western-style” housing is not natural, and the same might apply to AD in humans.

Lifestyle choices initiate and genetic predispositions accelerate AD pathogenesis

Evolution is driven primarily by the selection of advantageous genetic variants. The dramatic trends in AD incidence in industrialized countries over the last century, and the similar rise in emerging economies nowadays, can therefore not easily be explained by genetic influences. It is more likely that AD follows the same lifestyle changes that parallel the well-known increases in type 2 diabetes, obesity, high blood pressure and arteriosclerosis, all well-known risk factors for AD [44].

Another rather crude example, serving to explain the interplay between a certain lifestyle choice and AD risk and to demonstrate the interplay with a presumed genetic predisposition, comes from the USA. Compared to the general population, players in the National Football League (NFL) have a threefold increased risk of developing major depression [45], and those who incurred three or more concussions during their careers had a fivefold higher risk of amnestic mild cognitive impairment (MCI) [46]. According to the authors of the NFL study, traumatic brain injury is a causative risk factor for AD and other forms of dementia. The average age at AD diagnosis was 53.8 years, which indicates that, in these cases, it is definitely not age that increases AD risk. Several other studies confirmed these findings (for review see [47]): in particular, speed players, who commonly build up considerable momentum prior to tackling or being tackled, even showed a significant sixfold higher mortality from AD and from amyotrophic lateral sclerosis in comparison with the general US population. Interestingly, ApoE4 carriers (i.e. carriers of a specific genotype variant of the apolipoprotein E polymorphism) are more susceptible to concussion-mediated AD risk, which is in line with the experimental observation that ApoE4, in contrast to ApoE2 and ApoE3, makes particularly the BBB more vulnerable to proinflammatory insults. These alterations lead to an increased neuronal uptake of multiple blood-derived neurotoxic proteins, as well as to microvascular cerebral blood flow disturbances. In ApoE4-expressing animal models, such vascular defects precede neuronal dysfunction and can initiate neurodegenerative changes [48].

The ApoE4 polymorphism is known as the most common genetic risk factor for sporadic AD, with 15 % of Caucasians being carriers. Although ApoE4 is neither necessary nor sufficient for the development of AD, having one or two copies of the ApoE4 allele increases late-onset AD risk about 3- to 12-fold, respectively [49]. Interestingly, ApoE4 is a uniquely human allele and the most ancestral, and its appearance in evolution marks the dramatic increase in the human lifespan (for review see [50]). In contrast, the human ApoE3 allele appears to be neutral regarding AD risk; it emerged much later in human history, and its frequency increased during human evolution, with 75 % of Caucasians now being carriers. Why has the ApoE4 allele not been completely replaced by natural selection with the health-beneficial ApoE3 or ApoE2 allele, which even appear to lower AD risk? There are several possible explanations. For instance, it is assumed that alleles that are detrimental to health at older age might persist in populations because they confer, by means of antagonistic pleiotropy, some benefit to younger individuals [51]. ApoE is a major supplier of the cholesterol precursor for the production of oestrogen and progesterone. According to one recent study, women who carry at least one ApoE4 allele have significantly higher levels of mean luteal progesterone than women who carry only ApoE2 or ApoE3 alleles. ApoE4 might therefore be advantageous for fertility and thus for reproductive performance [52]. Another study found an association of ApoE4, when compared to ApoE2 and ApoE3, with good episodic memory and an economical use of memory-related neural resources in young, healthy humans [53], albeit other studies failed to provide evidence for such a benefit [54].

Whatever the advantage of ApoE4 for some might be, we should be aware that all our observations regarding the disadvantages of any genetic variant with respect to AD were made in human populations (or caged animals) that dramatically changed their lifestyle in recent history. We simply do not know whether ApoE4 would be neutral, or might even provide a benefit for cognitive health in the elderly, if current behavioural and environmental deficiencies were eliminated. Indeed, it is noticeable that the detrimental effects of the ApoE4 polymorphism on cognition in particular may depend strongly on modifiable risk factors (for review see [50]). For example, in some studies, engagement in physical activity seems to reduce AD risk primarily in ApoE4 carriers [55], in line with the “grandmother hypothesis” (see below), which argues that the evolution of the long human lifespan likely required older individuals to maintain high levels of both physical and cognitive health. Like the NFL example discussed above, recent cohort studies provided evidence that ApoE4 makes us more vulnerable to unhealthy lifestyle choices. In one study, hypercholesterolemia was associated with cognitive impairment mainly among ApoE4 carriers [56]. Hence the advantage in fertility, as outlined above, might turn into a disadvantage upon altered behaviour that leads to increased cholesterol levels. In another study, which will be detailed below, ApoE4 subjects adhering to a healthy Mediterranean diet (MeDi) had the greatest benefit regarding brain health when compared to other genotypes [57]. This tendency was confirmed by the finding that particularly carriers of the ApoE4 allele benefited from the nutritional consumption of fish, as their AD risk dropped by a factor of two, whereas non-ApoE4 carriers had no such advantage [58].
Furthermore, a recent study provided evidence that ferritin levels in the cerebrospinal fluid (CSF) are negatively associated with cognitive performance in cognitively normal (i.e. averagely mentally declining with age), MCI and AD subjects, and predicted the speed of MCI conversion to AD [59]. Interestingly, the elevated CSF ferritin levels were found to be strongly associated with the ApoE4 genotype. From an evolutionary point of view, ApoE4 carriers might have had an advantage in that their brain was provided with sufficient iron in situations of low nutritional iron availability. Since nowadays diets usually contain large amounts of iron-rich animal products (as outlined above for modern Japanese society), the potential physiological advantages of ApoE4 in iron metabolism might backfire, leading to elevated brain iron, which promotes enhanced production of reactive oxygen species (ROS) and adversely impacts AD progression.

Hence it is conceivable that the most common genetic risk factor for sporadic AD, namely ApoE4, might be important for brain health under those conditions that prevailed for the longest part of our evolution [50]. Therefore, it might only be regarded as a risk factor under unhealthy lifestyle choices (such as professionally acquired head traumata, lifestyles that lead to increased cholesterol levels or iron intake, low physical activity, etc.). The ApoE4 allele might therefore not be a cause of AD but might rather act as an accelerator under such (unhealthy) conditions. Indeed, each ApoE4 allele exerts a dose-related earlier onset of AD [60]. The situation might be similar to the so-called obesity alleles, which simply provide the carrier with a genetic variant that actually improved the fitness of his or her ancestors by being particularly efficient in energy conservation under living conditions that alternated between IMF and feasts after successful foraging. Nowadays, under lifestyle conditions that provide a steady energy supply at minimal physical expenditure, the same genes are regarded as “disease genes”. The good news, both for carriers of “obesity genes” and particularly of ApoE4, is that they will benefit most from a return to a healthy lifestyle.

Taken together, the aetiology of AD appears not to be different from that of other current “culture-borne diseases”, which are caused by well-known deficits in physical exercise, intake of healthy food, intermittent fasting and social activity. The AD epidemic might simply be due to the development of a highly sedentary, overindulgent and occupation-specific lifestyle that largely eliminates extended family bonds and time to relax. In order to prevent and treat AD, we might need a better understanding of how the true causative risk factors interact. But, much more, we need to become more open-minded: we should stop regarding AD as a natural (genetic) disease or a causal consequence of aging and begin to accept that AD is caused by environmental factors and behavioural deficiencies (i.e. the well-known risk factors). Consequently, AD should be preventable and, in the early stages of the disease, might even be curable, provided that we (re-)assume a lifestyle that satisfies all the essential requirements of our brain, which result from our particular evolutionary life history as humans.

Evolution of human longevity

We are a product of evolution. Therefore, any explanation of a human ailment like AD must be founded on evolutionary theory if we want to regard its aetiology and pathogenesis as fully understood.

The second law of thermodynamics describes the natural instability of any information content, including that encoded in the DNA of all our cells. Hence life had to overcome a principal obstacle and find a way to preserve genetic information. Nature’s solution was the repetitive duplication of genetic material (analogous to our need to frequently copy our favourite data stores if we want to avoid losing their information content). The quality control mechanism after the copying process is the selection of those copies that provide their carrier with the ability to efficiently continue the DNA duplication process. This mechanism allowed adaptation to an ever-changing environment and the evolution of quite different reproduction strategies. Bacteria, at one end of the spectrum, engage in mass production and mass genetic variation to increase the probability of survival of at least a few genetic copies. Humans, at the other end, only have to generate relatively low numbers of descendants in order to survive as a species, since their large brain permits them to engage in complex social collaborations and to use acquired knowledge to enhance the chances of their progeny’s survival: according to large and complete multi-generational demographic records of pre-modern Canada and Finland, for every post-reproductive decade of a grandmother’s life, at least two more of her grandchildren reached reproductive age, when compared to families in which the grandmother died earlier [61]. Therefore, the older the grandmother beyond 60, i.e. 70, 80, or 90 (i.e. the ages at which age supposedly causes mental decline), the more she was able to increase the number of her grandchildren and the chance of their survival into adulthood. Longevity thus became a selected trait acting by transgenerational generativity. Generativity, in a psychosocial sense, refers to the human concern for establishing and guiding the next generation.

This particular evolutionary strategy developed early in human history and is known today as the grandmother hypothesis (GMH) [62]. The GMH explains the evolutionary origin of our exceptional longevity when compared to our genetically closest cousins, the chimpanzees, which die a few years after reaching menopause [63]. The objection to the GMH that longevity is a rather modern phenomenon, since hunter-gatherer societies have low average life expectancies, can be rebutted, as those low averages reflect high infant mortality and not individual life expectancy. Indeed, in still-existing hunter-gatherer societies that are insulated from modern life, about two thirds of those who reach adulthood actually survive to an age of 70 years, and even encountering an 80-year-old might be no exception [64]. The reproductive age in hunter-gatherer populations was between 15 and 45 years; hence a grandmother first became independent of rearing her own children when her last child reached the childbearing age of about 15 years. Exclusive grandmotherhood, i.e. the years independent of motherhood, thus started at around 60 years of age and had to last for at least another 15 years, until the first grandchildren of her latest child reached adulthood. Interestingly, the GMH was recently supported by observations of extended Orca whale families [65]. As in humans, female Orcas can live several decades beyond menopause, and their reproduction strategy also relies on the use of knowledge acquired over a lifetime. In particular, the oldest post-menopausal females lead their groups to salmon foraging grounds and organize the hunt, especially when salmon abundance is low, which would otherwise reduce reproductive success and drive mortality rates.
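The demographic arithmetic behind this timeline can be laid out explicitly (a minimal sketch; the ages used are those quoted above for hunter-gatherer populations):

```python
# Timeline of exclusive grandmotherhood, using the ages cited above
# for hunter-gatherer populations.
first_childbearing_age = 15  # years; start of the reproductive age span
last_childbearing_age = 45   # years; end of the reproductive age span

# A grandmother's last own child reaches childbearing age when she is:
start_of_grandmotherhood = last_childbearing_age + first_childbearing_age  # 60

# The first grandchildren of that last child reach adulthood ~15 years later,
# defining the minimum duration of exclusive grandmotherhood:
min_end_of_caregiving = start_of_grandmotherhood + first_childbearing_age  # 75

print(start_of_grandmotherhood, min_end_of_caregiving)
```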

Taken together, the GMH fundamentally contradicts theories of AD being a general error in our genetic program or simply being caused by aging, since the evolution of longevity as a reproduction strategy requires a brain that functions well particularly at higher age, so as to endow the kinship with the acquired life experience. Furthermore, and as the GMH predicts, under conditions closer to those that prevailed during the long pre-modern era of human development, and to which our genetic program is exquisitely adapted, the human brain maintains the lifelong faculty to acquire empirical knowledge [66]. One key mechanism is the lifelong growth of our memory store by adding new neurons through AHN; another is the lifelong maintenance of already existing neurons, i.e. neuronal rejuvenation (NRJ), through the mechanisms of autophagy and the regeneration of subcellular structures and organelles. Both mechanisms, NRJ [67] and AHN [68], appear to function efficiently even at high age, which is in line with the GMH (see below), but both require certain environmental cues and depend on our lifestyle choices; and if defunct, both are also closely and causally linked to the aetiology of AD, as I will also show below.

Lifelong neuronal rejuvenation (NRJ)

Mitochondria sustain cellular bioenergetic homeostasis by playing key roles in a broad range of core cellular functions, including energy production, metabolic and calcium signalling, and a number of biosynthetic pathways [69]. They also perform a variety of functions in processes such as the transduction of metabolic and stress signals and the production of free radicals such as reactive oxygen species (ROS). Originally envisioned as a “necessary evil”, i.e. a by-product of an imperfect oxidative metabolism, ROS are now recognized as having an essential signalling function in cellular physiology [70]. Only in the case of mitochondrial damage does excess accumulation of ROS evoke an inflammatory response, which inhibits AHN but also leads the mitochondrial host cell to engage the cell death program, which in the case of neurons leads to apoptotic neurodegeneration [71]. Hence chronic perturbation of mitochondrial function aggravates the pathogenesis of many neurodegenerative diseases. A complex defence mechanism against this type of oxidative stress has evolved, consisting of endogenously generated ROS-sequestering molecules and enzymes (e.g. α-lipoic acid, superoxide dismutase, glutathione) and many bioactive nutrients (e.g. plant-derived vitamins and polyphenols), which detoxify the harmful effects of ROS while maintaining their signalling capacity.

In addition, cells like neurons, which have little or no turnover, survive to the high age of the individual only by maintaining intracellular youth through the process of NRJ [72]. Central to maintaining cellular bioenergetic homeostasis is a healthy mitochondrial population that ensures that cellular energy demands are met by energy supply [71]. To this end, the exchange of damaged macromolecules and cellular organelles, including mitochondria, evolved as an active and highly regulated process [73], requiring behavioural cues for its initiation. Accordingly, accelerated aging [74], memory decline and AD [75] might be caused by a lack of these cues. Conversely, the accumulation of non-functional macromolecules and damaged mitochondria affects cell function, with symptoms arising when, over many years, the rate of damage exceeds the rate of repair and turnover [76]. Accordingly, autophagy is important to slow aging, inflammatory processes and cell death [77].

The master regulator/inhibitor of autophagy is the mammalian target of rapamycin (mTOR) [78]. This intracellular kinase functions as a key signalling node that integrates information regarding extracellular growth factor stimulation, nutrient availability and energy supply [79]. The fungal metabolite rapamycin was accidentally found to block mTOR and became not only the eponym of mTOR but also the main molecular tool for dissecting mTOR function. Rapamycin treatment was found to activate autophagy by inhibiting mTOR [80], thereby slowing down both aging and cognitive decline in caged mice [81], suggesting that inefficient autophagy, as part of the NRJ program, might be a central element of both processes. Conversely, caging (standard housing) might eliminate important behavioural cues (such as physical activity or intermittent fasting, see below), which leads to unnaturally high activity of mTOR and low activity of peroxisome proliferator-activated receptor γ coactivator 1α (PGC-1α) [82], the master controller of mitochondrogenesis [83] (see below), thereby inhibiting the neuron-rejuvenating and -protecting process of autophagy [84]. Hence the aging and cognitive decline observed in murine models of AD might be regarded as artificial, not reflecting the aging process under natural conditions.

Besides physical exercise, which was shown to promote autophagy of defunct organelles and macromolecules in the brain [85], another behavioural cue that inactivates mTOR is chronic caloric restriction (CCR), which is well known to delay aging and extend lifespan in essentially all eukaryotic organisms [86]. But is CCR a physiological cue, or rather an artefact of experimental research that simulates, to some extent, intermittent fasting (IMF), a dietary pattern that from an evolutionary point of view is more natural (see below)? IMF is experimentally more labour-intensive than CCR and therefore less well studied. Nevertheless, autophagy, and in particular mitophagy, was found to be activated by CCR through inhibition of mTOR in essentially all species investigated, ranging from yeast to flies, worms, fish, rodents and even rhesus monkeys [87], thereby decelerating mTOR-driven aging [88]. CCR not only extends lifespan, it also protects the central nervous system from neurodegenerative disorders, whereas excessive caloric intake is clearly associated with accelerated aging of the brain and an increased risk of neurodegenerative disorders due to suppressed autophagy [89].

Nevertheless, IMF was shown to create a more robust and steady inhibition of mTOR-accelerated aging and cognitive decline than CCR [90]. This is explained by the fact that the main hormone-like signalling molecules of the metabolic status during IMF, the ketone bodies acetoacetate (AcAc) and D-β-hydroxybutyrate (βOHB), are more efficiently generated during fasting than by CCR [91]. These two respiratory fuels can be produced endogenously by the liver in large quantities (up to 150 g/day) from mobilized fatty acids under a variety of physiological [92] or pathological conditions [93]. In humans, basal serum levels of βOHB are in the low micromolar range, but rise to several hundred micromolar after 12 to 16 h of fasting [92]. Importantly, when blood glucose and insulin are low [94], up to 60 % of the brain’s energy needs can be derived from ketone bodies, replacing glucose as its primary fuel [95]. Similarly high levels of up to 1 to 2 millimolar βOHB are reached after prolonged endurance exercise [96]. A physiologically relevant increase in ketone body production is already achieved by fasting overnight, and can be enhanced further by being physically active before breaking the fast in the morning [97]. This most likely mimics the situation faced by our foraging ancestors, who went out hunting or gathering food with their stomachs empty.

Since neither long- nor medium-chain saturated fatty acids can pass the BBB, only their transformation into ketone bodies allows our energy-demanding brain to access the body’s largest energy store, the adipose tissue. In fact, ketone body production reduces the glucose requirement and preserves gluconeogenic protein stores during fasting, which enables a profound increase in the capacity for survival [98]. Interestingly, and again in line with the GMH, the elderly generate ketone bodies at least as efficiently as younger adults during IMF [99], and the metabolic response to a ketogenic diet also appears to be unaffected by aging [100].

As hinted at above, the observation that IMF is superior to CCR also makes sense from an evolutionary perspective, as not chronic starvation but rather the periodic alternation between fasting and intake of high-caloric meals after successful foraging was the ancient normality. Importantly, recent evidence suggests that our phylogenetically conserved genetic program uses the metabolic changes that originate from IMF as a behavioural cue for the initiation of subcellular renewal [101]. This is a good thing, since in order to maintain cellular youth, we do not have to starve by CCR. It is sufficient to alternate phases of fasting, which just need to be long enough to induce ketone body production (for instance 12 h overnight), with phases of eating, in which the total energy demand of our body can be met. In contrast, current normality consists of a constant feeding pattern, which results in permanently high mTOR activity (and low PGC-1α levels; see below) and thereby suppresses cellular rejuvenation. A sedentary lifestyle aggravates this pro-aging effect, whereas prolonged physical exercise reduces mTOR activity [102], possibly also by increasing ketone body production.

Taken together, inhibiting mTOR by being physically active and by IMF is a natural way to stay healthy. But there is always the hope that we can instead put a healthy lifestyle into a pill: rapamycin is an approved immunosuppressant used in humans to prevent rejection after organ transplantation, and clinical trials are meanwhile under way to test its efficacy in AD (for review see [103]). But given that IMF, a nutrient-rich diet, regular aerobic exercise and sleep have more beneficial functions than just inhibiting mTOR, it is questionable whether long-term health can be achieved by a drug without a lifestyle change. For instance, many of the well-known beneficial effects of IMF as well as of physical exercise are mediated by the upregulation of PGC-1α. Activation of PGC-1α also reduces neuroinflammation [66], for instance by inducing the expression of ROS-detoxifying enzymes [104]. Hence two potential causes of aging, chronic mitochondrial dysfunction and neuroinflammation (see below), can be explained by reduced levels of PGC-1α [105], and might originate from a lack of physical exercise, of IMF, or of both, or from many other behavioural deficiencies, which will be discussed in detail below. Hence I doubt that behavioural deficiencies can be compensated for by providing a single drug.

In summary, behaviourally suppressed cellular rejuvenation affects all organ systems and leads to frailty as we age. But we should not fall into the trap of blaming frailty or aging per se as the primary cause of physical and in particular cognitive decline [106]. “Normality”, as we usually define and see it, might be misleading, since it only reflects a statistical mean of current behaviour and its consequences (such as the average decline of cognitive abilities with age). In fact, we prolong our lifespan despite unhealthy aging because we nowadays benefit from intense medical care [107]. But the ability to reverse this trend is in our own hands, thanks to the genetic program that evolved with our ancestral grandmothers.

Physiological role of amyloid beta (Aβ)

The hippocampus complex (HC), i.e. the archicortical hippocampus and entorhinal cortex, is the central organ for remembering personal life experiences [108]. Since every moment in life is unique and cannot be repeated like vocabulary, this old part of our brain has maintained the ability to learn essential information instantaneously. This capability is a prerequisite for spatial navigation in new terrain, as during gathering and hunting, for social learning from personal interactions and narrations, and even for the long-term remembering of one’s own thoughts. As an indicator of their potential significance for survival and reproduction, the HC selects experiences for memorization by their emotional effect. (Unfortunately, vocabulary is rarely exciting enough to be remembered quickly.) The HC thereby memorizes at least three interrelated aspects of a life episode: what happened (content information), where it happened (spatial information) and when it happened (temporal information). As the HC has a limited memory capacity, the daily acquired content information is stored only transiently. In order to remember important (emotionally exciting) life events lifelong, the content information is consolidated in the neocortex during the slow-wave sleep (SWS) phases in the early part of the following nocturnal sleep [109]. In the second part of nocturnal sleep, the new memory is cross-linked with former experiences during the rapid eye movement (REM) phases, a process thought to inspire insight while we dream [110]. The spatiotemporal information, i.e. the contextual “where and when” of the memorized event, is maintained long-term in the dentate gyrus (DG) network of the hippocampus and used as an index for retrieval (reactivation) of the neocortical memory traces (see below).

Hippocampal fast learning relies on the mechanisms of synaptic long-term potentiation (LTP), using glutamate as a key neurotransmitter. Glutamate alters the strength (weight) of specific synaptic connections, thereby creating a new hippocampal memory trace. Any increase of neuronal activity concomitantly induces an increase of β-amyloid (Aβ) production and release at the same synapses, where Aβ plays an important role in learning and memory formation [111]. Aβ is mainly secreted in monomeric form with 40 amino acids (Aβ40), which has only a slight tendency to aggregate. Conversely, Aβ42 is produced in low quantities and is more prone to the formation of oligomers, protofibrils and fibrils. These aggregates represent the main form of Aβ contained in AD brain plaques [112]. This observation, and the fact that aggregation-prone Aβ42 is produced in higher quantities in the presence of certain AD mutations [113], has led researchers to ascribe the pathological effects almost solely to the oligomers and the physiological properties of Aβ almost solely to the monomers. It was also the foundation of the amyloid cascade hypothesis.

However, a certain degree of oligomerization is likely to occur even at low Aβ concentrations; hence the native state of the peptide, in which it exerts its multiple functions, is difficult to elucidate [114], particularly as the oligomeric states of Aβ vary [115]. In any case, the concentration of soluble Aβ in the normal healthy brain has been estimated to lie in the picomolar range, with species ranging from monomers to higher oligomers [116]. One important physiological function of Aβ might be as a regulator of glutamate release probability [117], whereby the activity-dependent modulation of Aβ production acts as a negative feedback regulator of synaptic plasticity [118]. It has been shown that at lower, picomolar concentrations Aβ increases LTP, whereas at higher, nanomolar concentrations Aβ leads to a reduction of potentiation [119]. This is in line with the proposed mode of action, in that Aβ, depending on the dose, acts either as an agonist, supporting the fast formation of new memories, or as an antagonist, preserving new memory traces from being overridden by subsequent events that are to be memorized [120].
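The proposed dose-dependent switch between agonist and antagonist action can be sketched as a simple threshold model. This is purely illustrative: the function name and the cut-off values are hypothetical, chosen only to mirror the qualitative picomolar-versus-nanomolar finding cited above, not to reproduce measured dose-response data.

```python
# Illustrative sketch of the dose-dependent modulation of LTP by soluble Abeta.
# Thresholds are hypothetical and serve only to encode the qualitative claim:
# picomolar concentrations enhance LTP, nanomolar concentrations reduce it.

def ab_ltp_modulation(conc_molar: float) -> str:
    """Return a qualitative label for the effect of soluble Abeta on LTP."""
    if conc_molar < 1e-12:
        return "negligible"          # below the physiological picomolar range
    if conc_molar < 1e-9:
        return "enhances LTP"        # agonist: supports fast memory formation
    return "reduces LTP"             # antagonist: protects traces from overwrite

# The same peptide acts as agonist or antagonist depending only on dose:
assert ab_ltp_modulation(200e-12) == "enhances LTP"   # ~200 pM
assert ab_ltp_modulation(50e-9) == "reduces LTP"      # ~50 nM
```

Such a model also makes the feedback argument of the text concrete: if clearance fails and the concentration drifts from the picomolar into the nanomolar regime, the very same signalling molecule flips from supporting to suppressing plasticity.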

If this negative feedback loop gets compromised, either too much Aβ is produced or too little is cleared. Subsequently, higher concentrations of soluble oligomeric Aβ form and exert synaptotoxic effects, even without plaque formation [121]. This might explain the relatively high proportion of symptomatic patients with clinical signs of early AD but a negative scan for Aβ depositions, the so-called suspected non-amyloid pathology [122]. With the realization that the distribution of soluble Aβ oligomers correlates better with cognitive decline in AD than the prototypical amyloid plaques do [123], the former amyloid cascade hypothesis was replaced by the oligomeric cascade hypothesis [124].

The build-up of high amounts of oligomeric Aβ induces neuroinflammation, causes oxidative damage and negatively influences multiple signal transduction events, including the activation of glycogen synthase kinase-3β (GSK-3β) [125], a kinase pivotal not only in AD but also for memory consolidation, which limits its use as a target for AD therapy. For instance, activation of GSK-3β inhibits AHN and promotes neuroinflammation and apoptosis (for review see [126]). Furthermore, its upregulation leads to phosphorylation of the amyloid precursor protein (APP) and of tau, both associated with the pathological processes that lead to the hallmarks of Alzheimer’s disease (AD), i.e. Aβ plaques and neurofibrillary tangles, respectively [127]. GSK-3β activity thereby shifts tau, a stabilizer of microtubules, into a hyperphosphorylated, non-functioning state (p-tau) [128]. Tau pathology is progressive and detrimental to affected neurons via both loss of tau function [129] and gain of toxic function from pathological p-tau aggregates [130]. Hence not only high concentrations of oligomeric Aβ, but perhaps rather dysfunctional p-tau, might be regarded as a key driver of AD progression, which is corroborated by the fact that p-tau deposition correlates more closely with disease stage [131].

But are Aβ or p-tau the initiators or the consequences of AD pathogenesis? This question has been debated for a long time. For example, the proponents of the amyloid cascade hypothesis blamed primarily age as the causative factor for Aβ plaque formation, and pharmacological efforts were therefore focused on the elimination of amyloid plaques. Ironically, not the insoluble plaques but, as mentioned above, rather the excess of soluble oligomers was recently shown to be harmful [132]. But even the removal of Aβ oligomers by current pharmacological strategies that evolved from the newly proposed oligomeric cascade hypothesis might turn out to be harmful. Not only because the primary causes of AD remain ignored (see below), but also because Aβ is in fact required for synaptic plasticity. For instance, the healthy physiological function in LTP and memory appears to depend on a delicate balance between the secreted Aβ monomer (in different lengths and concentrations) and multiple oligomeric states (for review see [133]). This caveat might explain why, in experimental animal models of AD, the removal of Aβ by different anti-Aβ antibodies, which have also been used in human treatment studies, was not only ineffective at repairing neuronal dysfunction, but even caused detrimental cortical hyperactivity [134]. It has been concluded that this unexpected finding provides a possible cellular explanation for the lack of cognitive improvement by immunotherapy in human studies [135, 136]. In other words, the sole attack against Aβ interferes with neuronal homeostasis, causing a further impoverishment of learning and memory.

Hence the current key strategy to cure AD, or at least to delay disease progression, might even worsen the status of the patients. As Puzzo et al. recently put it [133]: “the vision of Aβ exclusively as a ‘bad’ protein has probably prevented us to focus on other important aspects of the disease.” The authors further point out that “Aβ is already present inside neurons in infant brains, and even increases up to 8 years of age, a period of high brain plasticity, when about half of the neurons are Aβ-immunopositive. In adulthood, Aβ is present in the major part of the neurons whereas in aged people there is a 20 % reduction.” Furthermore, for the deposition of Aβ in the brain, only Aβ seeds are critical, but not the age of the animal models [22]. But if not age, what other factors might lead to Aβ seed formation? Only the answer to this critical question will open a pathway to AD prevention strategies and an AD therapy that is truly causal. It becomes more and more clear that a rise of Aβ concentrations is neither inevitable nor a natural consequence of aging. Rather, other factors are at play and, intriguingly, most of them can be modified in various ways, one of them being sleep.

Memory growth during sleep

One key function of sleep is the consolidation of new hippocampal memory traces in the neocortex for long-term storage, and the gaining of lifelong experience by integrating them into the existing body of knowledge. According to the synaptic homeostasis hypothesis, this is achieved by repetition of the memory content during SWS, but also by the differential renormalization of synaptic weights, which includes selective long-term depression, essentially the reversal of the effects of LTP [137]. Correspondingly, in order to enable hippocampal encoding of new memories during the following wake phase, Aβ levels, which accumulated during the information collection phase of the previous day, need to be reduced in order to reactivate the full potential of LTP. Hence Aβ clearance might have developed as another active function of sleep; it is accomplished via an enlargement of the interstitial space, which results in a striking increase in convective exchange of interstitial fluid with cerebrospinal fluid and export of Aβ across the BBB [138]. This restorative action regarding memory function also reduces the risk of the concentration-dependent aggregation of larger quantities of potentially synaptotoxic oligomeric Aβ [139]. It is therefore an important finding that lack of sleep in mice and humans increases the Aβ concentration above the critical threshold at which massive oligomeric aggregation occurs [140, 141].

Another important function of sleep is to provide the temporal space for AHN. During sleep, cortisol, which in high concentrations inhibits AHN, is downregulated, whereas insulin-like growth factor 1 (IGF-1), growth hormone (GH), melatonin as well as BDNF, which all promote AHN, are upregulated. Hence prolonged sleep deprivation is detrimental to AHN [142] and thereby decreases the number of possible distinct codes (indexes) that boost hippocampal memory capacity and performance (see below) [143]. Similarly, post-training ablation of adult-born neurons was shown to destroy previously acquired memories [144]. New granular cells in the DG are particularly important in differentiating former from similar but novel experiences by remembering their spatiotemporal context, thereby reducing interference of new with former memories [145, 146]. Furthermore, in order to remember, i.e. to access and retrieve the respective event-specific, neocortically distributed memory traces and to reconstruct the contextual experience, the hippocampal spatiotemporal information of the remembered event is required [147], as originally outlined by the hippocampal memory index theory (HMIT). According to the HMIT, hippocampal–cortical system consolidation of remote memories requires the maintenance of hippocampal indexes [148]. Hence we remember episodes of our life by the spatiotemporal context stored in the new neurons generated by AHN [144]. Therefore, remote memories are best maintained by the lifelong creation of new adult-born DG-neurons [149]. The expansion of the spatiotemporal memory capacity thereby also becomes a prerequisite for the continuous expansion of autobiographic memory [150].
This explains why a disturbed AHN not only causes the hippocampal archive of indexes that link to episodic neocortical engrams to run out of storage capacity, but also why it causes discrimination errors (interferences) between former and new experiences, which lead to an overgeneralization of fear and sustained posttraumatic stress [151]. Recently it was shown that in transgenic mouse models of early AD, direct optogenetic activation of hippocampal memory engram (index) cells results in memory retrieval, despite the fact that these mice are amnesic in long-term memory tests when natural recall cues are used, which reveals a retrieval rather than a storage impairment [152]. Interestingly, optogenetic induction of LTP at perforant path synapses of dentate gyrus engram cells restores both “age-dependent” spine density and long-term memory of the caged animals, explaining why, for instance, social activity under environmental enrichment prevents memory decline in an AD model, and thus why not age, but rather an unnatural lifestyle, causes AD in these models [153], as will be outlined in more detail below.

The reduced interference of new with former experiences in an expanding archive of indexes comprehensively explains the importance of AHN and how this process promotes the persistence [154] as well as the precision of contextual memory [155]. In order to remember efficiently, adult-born hippocampal brain cells show a lower threshold for LTP, i.e. learning, while in turn LTP up-regulates dendritic spine density, with the largest changes occurring during the early phase of their maturation, when they begin to form synapses with the existing circuitry [156]. Via these mechanisms, maturing young adult-born DG-cells are converted into high-information neurons specifically by the newly experienced events. Interestingly, conditions like voluntary exercise, which stimulate AHN, also increase spine density in the hippocampus, leading to increased synapse formation and decreased synapse elimination as well as increased survival rates of the maturing neurons [157]. Following this observation, some researchers suggested the hypothesis that a primary function of AHN involves the production not only of new neurons per se, but of new neurons with the synaptic properties of relatively young neurons [158]. As this mechanism provides for a turnover of almost 2 % of the cells per year in the DG, AHN has the potential to keep this brain region, critical for acquiring novel experiences, young even at old age [159]. But only, in line with the UTAD, if we provide the behavioural cues for the initiation of AHN as well as for the maturation and integration of the new DG-neurons (see below).

AHN regulates mood and has antidepressant properties: implications for AD

It is of interest that in AD, the perirhinal (PRH) and lateral entorhinal cortex (LEC) of the HC are among the first cortical regions to be affected [160, 161]. Both regions were found to provide the most critical content information (i.e. “what is happening”) to the newborn DG-neurons via the perforant pathway projections (for review see [162]). In addition, back-projecting signals from the CA3 region of the hippocampus, which receives direct signals from the medial entorhinal cortex (MEC), provide the required spatiotemporal information (i.e. “where and when is it happening”) to the newborn DG-neurons [163]. Both inputs appear to be required for event or pattern separation [164]. The findings that hippocampus degeneration is one of the most prominent and earliest characteristics of AD [165], with the perforant pathway being one of the very first structures to suffer damage [166], and that intra-hippocampal white matter lesion load is strongly associated with progressive MCI [167], support the hypothesis that chronically disturbed AHN might play a key etiological role in AD pathogenesis. This hypothesis is further supported by the evidence that all of the behavioural deficiencies as well as the environmental toxins discussed below are known to impair AHN and to increase the risk of AD. Therefore, a non-productive AHN might be connected not only histopathologically but also functionally with the aetiology of AD. Hence understanding this connection and the requirements for a productive AHN might help to develop a new understanding of AD and a causal strategy for prevention and therapeutic intervention.

Besides being required for spatial navigation, episodic learning and memory retrieval [168], new neurons generated by AHN also regulate mood and in particular psychological resilience (resistance to stress) by controlling the HPA-axis either directly [169] or indirectly [170, 171]. In older mice that experimentally lack new adult-born DG-neurons, corticosterone levels are slower to recover to baseline following moderate stress, and the HPA-axis is less suppressed by dexamethasone, indicating impaired HPA-axis feedback. This decisive involvement of active AHN in regulating psychological resilience is further evidenced by the observation that antidepressants like fluoxetine, a member of the selective serotonin reuptake inhibitor (SSRI) class, were shown to require an intact hippocampal neurogenic niche and a productive AHN to exert their antidepressant effect [172]. This antidepressant mechanism was identified in rodents and confirmed in non-human primates [173]. Similarly, a recent human study reported a significant increase in hippocampal volume due to antidepressant treatment [174]. Conversely, impairment of AHN leads to anxiety (inhibition of novelty-seeking behaviour), major depression [175] and chronically elevated cortisol, all three distinctive warning signs of early AD and potential causative risk factors [176, 177]. Interestingly, HPA-axis dysregulation was shown to occur at least as early as the MCI stage of AD and to accelerate disease progression [178]. Furthermore, a reanalysis of the data of the Framingham Heart Study found that depression at old age develops prior to cognitive decline and was a significant risk factor for dementia and AD [179].

Another hint that disturbed HPA-axis regulation caused by a chronically disturbed AHN might lead to AD is the finding that early-stage AD patients consistently show increased basal plasma cortisol levels [180] and also decreased sensitivity to low-dose dexamethasone suppression [181]. For this reason, cortisol measurements have been suggested as a reliable AD biomarker [182]. Further evidence that one of the earliest pathogenic events in AD might be caused by an inefficient AHN came from magnetic resonance imaging (MRI) studies: hippocampal atrophy predicted the time to progression from MCI to AD better than Aβ measures did [183].

Further support for the hypothesis that HPA-axis dysregulation might be an important causal contributor to memory decline in AD comes from the recent prospective Washington Heights-Hamilton Heights-Inwood Columbia Aging Project, whose findings clearly revealed that depressive symptoms in late life precede memory decline, but not vice versa [184]. In addition, higher scores on the depression measures predicted steeper cognitive decline even among individuals whose cognition was not pathologically altered at baseline, independent of age, sex, education and illness burden including vascular disease. This observation is in line with evidence from former studies showing that co-morbidity with depression is associated with a greater extent and progression of AD pathology, such as an increased neurofibrillary tangle load [185], and with a faster rate of cognitive decline [186].

Chronic HPA axis dysregulation promotes AD pathology

Acute stress leads to immediate physiological changes that promote an increase in sensory input. While heightened senses in life-threatening situations are important for survival, these alterations harbour the risk of information overload and hippocampal neurotoxicity due to excessive excitatory glutamate release [187, 188]. Hence it appears to be a useful adaptation that stress-induced HPA-axis upregulation concomitantly increases the production of hippocampal Aβ [189], which acts, as discussed above, as a regulator of glutamate release. Furthermore, Aβ monomers activate the phosphatidylinositol-3-kinase pathway, thereby inhibiting apoptosis and protecting neurons from excitotoxic death [190]. Monomeric Aβ was also shown to be neuroprotective by increasing neuronal activity-dependent glucose uptake through activation of the type-1 IGF-1 receptor, which is vital for maintaining neuronal glucose homeostasis [191]. In addition, monomeric Aβ protects neurons against oxidative stress in situations of high energy metabolism [192, 193].

How does activation of the HPA axis increase synaptic Aβ release? One recently identified mechanism is the up-regulation of amyloidogenic γ-secretase activity by the stress-response neuropeptide corticotropin-releasing factor (CRF) [194]. Moreover, experimental administration of glucocorticoids was found to increase hippocampal Aβ concentrations in aged non-human primates due to reduced production of insulin-degrading enzyme, which is known to degrade excess Aβ [195]. Further in vitro and in vivo experiments provided evidence that glucocorticoid treatment also enhances Aβ production by increasing steady-state levels of amyloid precursor protein (APP) and Aβ-cleaving enzymes [196]. Interestingly, these experiments also showed that glucocorticoids augment tau accumulation, indicating that this hormone, if chronically active, might accelerate the development of neurofibrillary tangles. The authors conclude that the notoriously high levels of glucocorticoids found in AD patients might not merely be a consequence of the disease process, but rather play a central role in the development and progression of AD. In support of this idea, chronic hypersecretion of cortisol in patients with lifetime major depression results in significant accumulation of Aβ in the brain, even in the absence of MCI or AD symptoms [197].

Taken together, while enhanced release of hippocampal Aβ in acute stress situations appears to be a useful neuroprotective mechanism, this up-regulation might turn into a problem when chronic stress, or a chronically disturbed HPA-axis regulation due to an impaired AHN, leads to reduced psychological resilience (weakened stress resistance) and chronic cortisol hypersecretion. Hence the question arises how our organism sustains a lifelong productive AHN and a functioning HPA-axis regulation.

Key requirements for productive AHN and the law of the minimum (LOM)

Since its discovery in animals and particularly in humans, numerous studies have revealed that AHN is a highly regulated phenomenon, which is under the control of local factors (the neurogenic niche), cytokines, growth factors and many hormones [198, 199], most of which are directly or indirectly controlled by behavioural or environmental cues. The study of food-storing [200] and song-learning birds [201] revealed the phenomenon of seasonal growth of functionally hippocampus-like brain structures and showed that AHN and behaviour (e.g. food-storing or song-learning) are dynamically interrelated [202]. This type of research has been extended to mammals, in which it was shown that hippocampal size, for example in kangaroo rats (Dipodomys), tightly depends on their natural space-use pattern [203], or that photoperiodic organisms like white-footed mice (Peromyscus leucopus) monitor environmental day length to engage in seasonally appropriate adaptations in physiology and behaviour, which are preceded by adjustments in brain volume via AHN [204]. This tight environmental and behavioural control of AHN makes sense from an evolutionary point of view, since brain size corresponds to energy expenditure. Hence the size of the hippocampus either grows or shrinks, depending on the individual’s vital need for memory capacity. Since our identity and the principles of the GMH depend on our lifelong ability to remember, AHN in humans has the propensity not only for seasonal but for lifelong growth.

What are the environmental and behavioural cues and key requirements for a lifelong productive AHN? Since mankind survived for the largest part of its history as hunter-gatherers, it is reasonable to assume that the complex regulation of a productive AHN was strongly adapted to the principal conditions under which mankind survived for hundreds of thousands of years. From an evolutionary perspective, survival of the tribe depended on memorizing the foraging ground under conditions of physical activity and hunger until hunting or gathering was successful, whereby social bonding (hunting in groups) and learning from elders (for instance hunting strategies) were critically important. It is therefore no surprise that social and physical activity, daily new challenges that provide eustress, and IMF interrupted by the ingestion of fresh and highly variable food rich in essential nutrients are all positive regulators of AHN (see below). Hence the basic requirements for a productive neurogenesis were naturally provided for by the pre-modern way of life. In contrast, and as will be detailed below, lack of physical exercise, loneliness (actual or perceived), chronic distress or an unhealthy Western diet (WeDi) combined with a constant (ad libitum) eating pattern are negative regulators of AHN, with the detrimental consequences outlined above.

The process from proliferation of neuronal stem cells through expansion of progenitor cells, followed by their growth and maturation (branching and synaptogenesis), to the final integration into the existing DG network takes about three months, maybe even longer [198, 205]. Besides the fact that proliferation itself is tightly controlled, requiring both activating signals and release from inhibitory ones, only a single-digit percentage of newborn progenitor cells become mature, integrated DG neurons, with most of the cells dying early [206, 207]. At each step, AHN requires specific environmental cues (usually transmitted through hormones), but also specific brain building blocks (essential polyunsaturated fatty acids, cholesterol etc.), social input (new memorable life experiences) and energy. Since each of these factors might individually limit AHN, the “law of the minimum” applies. This is an important aspect of the proposed UTAD, as this “law” allows us to understand how AD can be effectively prevented and treated, and why monotherapeutic interventions might continue to fail.

The principle that productive growth is restricted by limiting factors was first formulated in agricultural science by Carl Sprengel in 1828 [208] and was extended and popularized by Justus von Liebig as the law of the minimum (LOM) in 1840 (for references and the history of the Sprengel-Liebig LOM see [209]). The LOM states that growth is controlled not by the total amount of resources available, but by the scarcest resource. This is a very important notion if we regard AHN for what it is: a growth process. The LOM explains why a deficiency in any essential growth or maturation factor cannot, by definition (being essential), be compensated by another. Hence the lack of any behavioural or environmental cue or other essential growth factor that limits the proliferation, growth or maturation of new DG neurons must inevitably lead to disturbed AHN, with the consequences of cortisol hypersecretion, depression and an increased risk of developing AD.
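The LOM lends itself to a minimal quantitative sketch. In the following Python illustration (factor names and the 0-to-1 “sufficiency” values are hypothetical, chosen purely for demonstration), effective AHN capacity equals the scarcest factor, so improving an already sufficient factor changes nothing:

```python
# Minimal sketch of the Sprengel-Liebig law of the minimum (LOM) applied to
# AHN: effective growth capacity equals the scarcest essential factor, not
# the sum or average of all factors. Factor names and the 0-1 "sufficiency"
# values are hypothetical and purely illustrative.

def ahn_capacity(factors):
    """Return the limiting (minimum) sufficiency across essential factors."""
    return min(factors.values())

lifestyle = {
    "physical_activity": 0.9,
    "n3_pufa_supply":    0.2,  # the scarcest resource dominates
    "social_activity":   0.8,
    "sleep":             0.7,
}

assert ahn_capacity(lifestyle) == 0.2
lifestyle["physical_activity"] = 1.0   # boosting a non-limiting factor...
assert ahn_capacity(lifestyle) == 0.2  # ...leaves growth capacity unchanged
```

The point of the sketch is that no amount of a plentiful factor compensates for the limiting one, which is exactly the property that makes single-factor interventions weak.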

Epidemiological studies have identified many risk factors for AD as well as for major depression; many of these are limiting, essential factors for AHN according to the LOM, and most if not all of them are behavioural and therefore modifiable (see also Fig. 1 “requirements” for, and Fig. 2 “deficits” inhibiting, a productive AHN). Since the dynamics of hippocampal growth include the hippocampus's potential to shrink, each missing factor not only hinders productive AHN, but also leads to a reduced hippocampal size and an increased risk for depression and AD. One important prediction emanates from the LOM when applied to a potentially multi-causal disease like AD, in which individual lifestyles usually lead to several deficiencies: intervention trials that arbitrarily try to eliminate only one or a limited number of such deficits (i.e. risk factors) can have only weak effects, because only the fraction of the study population whose sole (!) deficit lies in this (or these) particular factor(s) will benefit from the respective intervention. Following is a commented list of requirements for a productive AHN, many of which are usually in deficit in economically advanced societies (see also Figs. 1 and 2):
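This dilution effect can be made concrete with a toy simulation (all numbers hypothetical): if each of several essential factors is independently deficient in a fraction of the population, only those participants whose sole deficit is the one being corrected can respond to a single-factor intervention:

```python
import random

# Toy simulation (all numbers hypothetical) of why correcting a single risk
# factor shows weak effects in a multi-deficit population: per the LOM, only
# individuals whose SOLE deficit is the corrected factor can respond.

random.seed(0)
FACTORS = ["exercise", "n3_pufa", "b_vitamins", "social", "sleep"]

def simulate(n=10_000, p_deficit=0.4, fixed="n3_pufa"):
    """Fraction of the cohort whose only deficit is the treated factor."""
    responders = 0
    for _ in range(n):
        deficits = {f for f in FACTORS if random.random() < p_deficit}
        if deficits == {fixed}:  # these individuals benefit from the trial
            responders += 1
    return responders / n

frac = simulate()
# Analytically: p_deficit * (1 - p_deficit)**4 ≈ 0.052, i.e. only ~5 % of
# the cohort can show a measurable benefit, diluting the trial effect.
print(f"fraction benefiting from single-factor correction: {frac:.3f}")
```

With these illustrative parameters, roughly one participant in twenty can benefit, which would leave the average trial outcome close to null even if the corrected factor is genuinely essential.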

Fig. 1 “Natural mechanisms preserve lifelong mental health”. Our genetic program is well adapted to the lifestyle conditions that dominated for the largest part of humans' life history. Ancient lifestyle was marked by extensive daily physical activity, alternating phases of fasting and a nutrient-rich diet, sufficient sleep and eustress. Furthermore, survival was critically dependent on extended family bonds, and hence went along with a rich social life. The exceptionally long postmenopausal period served the purpose of transgenerational generativity, which, according to the grandmother hypothesis, provides the main current explanation for humans' longevity. Under these natural conditions, all key physiological systems (immune and cardiovascular function, energy metabolism etc.) interact positively and thereby support key mechanisms like neuronal rejuvenation and adult hippocampal neurogenesis, allowing the lifelong acquisition of knowledge and preserving mental health up to high age. For details, please refer to the main text

Fig. 2 “AD is a deficiency disease”. Our genetic program is not adapted to the fast and very recent changes that define our modern lifestyle, which connotes individual combinations of physical inactivity, an ad libitum eating pattern of a nutrient-poor diet, and chronic distress emanating from the demands of a highly competitive labour situation, often accompanied by a loss of extended family bonds. Furthermore, under such conditions the concept of retirement, a late invention in humans' cultural history, counteracts the main purpose of late life from an evolutionary point of view: a lack of transgenerational generativity leads to a devastating lack of purpose in life. Deficiencies in essential requirements for mental health cannot, by definition, be compensated by our genetic program. Consequently, as defined by the law of the minimum, individual deficits hamper neuronal rejuvenation, and in particular hinder productive AHN. As the neuronal correlate of depression, disturbed HPA-axis regulation and cortisol hypersecretion, as well as other pathophysiological consequences (neuroinflammation and breakdown of the blood-brain barrier, insulin resistance, hypertension and arteriosclerosis), emanate from these lifestyle-derived deficits and lead to an accumulation of neurotoxic Aβ, hippocampal shrinkage in particular, and brain atrophy in general, hence the well-known hallmarks of AD. Under these conditions of behavioural deficiencies, environmental toxins, chronic infections and genetic predisposition accelerate AD progression. The indicated interactions between the different pathological processes activate a multitude of vicious cycles that make the AD process a runaway phenomenon, which can only be stopped and reversed to the situation depicted in Fig. 1 by a systems-biological approach, which is outlined in the main text and schematically presented in Fig. 3

Moderate but extended physical activity increases brain volume even in aging humans [210]. This makes a lot of sense from an evolutionary point of view: physical activity was a prerequisite for survival in pre-modern societies, and hormonal signals from the working body are used as a mechanism to enhance not only fitness but also autobiographic memory capacity (see below). In a sense, being physically active signals to the brain that new experiences are to be expected: the further one walks, the more one will experience and the larger one's hippocampal memory index has to grow. Nowadays, we have the choice to run or not to run, and most of us embrace effort-sparing technologies and lead sedentary lifestyles. The rise of physical inactivity enormously diminishes overall quality of life by reducing strength, the ability to perform daily chores, social interactions, mobility and cognitive performance.

In a well-controlled randomized study, 120 older adults without dementia were randomly assigned either to an aerobic exercise group (n = 60, average age 67.6 years) or to a stretching control group (n = 60, average age 65.5 years) [211]. The groups did not differ in hippocampal baseline volume as measured by MRI. Besides being healthy, a prerequisite for taking part in the program was a sedentary lifestyle, defined as being physically active for 30 min or less daily in the last 6 months before the intervention. After an initial 6-week training phase, the aerobic walking group reached a level of 40 min of daily walking within a target heart-rate zone of 60–75 % of the maximum heart-rate reserve, while the stretching and toning control group exercised with dumbbells or resistance bands and trained to improve balance, including a yoga sequence. After the one-year intervention, the aerobic exercise group showed an increase in hippocampal volume of about 2 %, counteracting the usual hippocampal shrinkage of 1–2 % annually in older adults even without dementia [212]. Such a loss of volume increases the risk of developing cognitive impairment, as outlined above. The stretching control group fitted this pattern of age-related loss and demonstrated a 1.4 % decline in volume over the one-year interval, indicating that aerobic exercise is a requirement for brain health at old age and that stretching is not sufficient. That long-term mild aerobic exercise is superior in enhancing AHN has also been shown in rodents [213]. In fact, short bouts of even intense physical activity had no positive effect on AHN [214].
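To put the reported one-year rates in perspective, a back-of-the-envelope compounding sketch helps (extrapolating the study's one-year figures over several years is my own illustrative assumption, not a result of the cited trial):

```python
# Back-of-the-envelope projection of relative hippocampal volume using the
# one-year rates reported above: +2 % for the aerobic group vs. -1.4 % for
# the stretching controls. Extrapolating beyond one year is an illustrative
# assumption of this sketch, not a claim of the cited study.

def project(volume, annual_rate, years):
    """Compound an annual fractional change over a number of years."""
    return volume * (1 + annual_rate) ** years

v0 = 100.0  # baseline hippocampal volume, arbitrary units
for years in (1, 5):
    aerobic = project(v0, +0.02, years)
    control = project(v0, -0.014, years)
    print(f"year {years}: aerobic {aerobic:.1f} vs control {control:.1f}")
```

Even a modest annual difference of about 3.4 percentage points, if it persisted, would compound into a double-digit gap in relative volume within a few years, which is why the one-year divergence between the groups is considered meaningful.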

Interestingly, the individual improvements in aerobic fitness of members of the walking group strongly correlated with the increase in hippocampal volume. Growth occurred mainly in the anterior section, which includes the DG. This is a significant finding, as cells in the anterior hippocampus mediate the acquisition of spatial memory as well as verbal memory performance [215], and this region is more prone to age-related atrophy than the posterior hippocampus [216]. Hence the positive effect was directly and significantly related to improvements in memory performance, indicating that increases in hippocampal volume due to aerobic exercise augment memory function even in late adulthood.

Changes in physical activity also have important consequences for health in general and for lifespan. In line with the GMH, unnaturally sedentary behaviour in 70-year-old men was shown to reduce the probability of survival up to the age of 90 from 54 to 44 % [217]. Physical inactivity leads to many so-called “lifestyle diseases”, which are sometimes rather inappropriately named diseases of longevity, as if age were the cause rather than a chronically unhealthy lifestyle [218]. Declining expression levels of neurotrophic factors such as BDNF, nerve growth factor (NGF) and glial cell-derived neurotrophic factor (GDNF) are strongly implicated in aging and AD [219]. But correlation with age does not equal causality [220, 221]. Rather, all these hormonal factors are positively regulated by exercise, suggesting that accelerated brain aging in the elderly is rather caused by a sedentary lifestyle [222] and therefore appears to have little to do with aging per se.

This direction of causality, i.e. from lifestyle to brain “degeneration” over the years, was tested in animal experiments. In one setting, the downregulation of GDNF in sedentary triple-transgenic AD mice was reversed by voluntary exercise, which improved synaptic functioning [223]. This particular triple-transgenic AD model (3 × Tg-AD) harbours the PS1M146V (a presenilin 1 mutant), APPSwe (a β-amyloid precursor protein mutant) and tauP301L (a p-tau mutant) transgenes and was specifically developed for evaluating the impact of both amyloid plaque and p-tau tangle pathology on synaptic function, as well as potential AD therapeutics [42]. In another experiment, NGF was also up-regulated by physical exercise and shown to stimulate AHN [224]. Furthermore, BDNF was shown to promote differentiation and maturation of adult-born DG neurons [225] and to be released from the hippocampus upon exercise [226].

In the aerobic walking intervention study outlined above, increased hippocampal volume in the exercise group was also associated with increased levels of serum BDNF, which were shown, at least in rats and pigs, to parallel hippocampal BDNF concentrations [227]. The observation that BDNF secretion in the hippocampus can be induced by exercise is an important finding, as this member of the nerve growth factor family was found to be reduced in the hippocampus and the temporal cortex of AD patients [228]. Another intervention study in humans provided further evidence that hippocampal volume loss, which was significantly and directly associated with poor memory performance, could be counteracted by moderate-intensity exercise [229]. Interestingly, both exercise-induced erythropoietin (EPO) and vascular endothelial growth factor (VEGF), released in reaction to a general or local decrease in oxygen concentration, respectively, enhance AHN. VEGF was even shown to be necessary for exercise-induced AHN [230, 231]. EPO was found to enhance hippocampus-dependent memory by modulating plasticity, synaptic connectivity and the activity of memory-related neuronal networks [232]. It also activates AHN [233, 234] and protects hippocampal neurons by increasing the expression of BDNF [235].

The importance of physical activity in the prevention of chronic diseases, and in particular AD, is also exemplified by BDNF's ability to stimulate both PGC-1α-dependent mitochondrial biogenesis in hippocampal neurons and the formation and maintenance of synapses [236]. Activating PGC-1α through exercise also plays a key role in inhibiting AD-driving neuroinflammation [237]. The significance of physical exercise for the brain's health is further substantiated by the fact that ever more hormonal signalling pathways are being identified which ensure that physical activity stimulates AHN and improves memory function. In addition to BDNF, NGF, GDNF, EPO and VEGF, mentioned above, physical exercise was found to stimulate AHN through dihydrotestosterone [238], which, not surprisingly, mediates the well-known antidepressant effects of physical activity [239]. Direct support for a link between androgen activity and AD came from a recent study providing evidence that androgen deprivation as part of prostate-cancer treatment doubled the risk of developing AD [240].

GH [241] and IGF-1 [242] also respond to repeated bouts of aerobic exercise, stimulate AHN and are neuroprotective. Fibroblast growth factor 2 (FGF-2) is likewise induced by physical exercise directly in the hippocampus [243] and was shown to restore hippocampal function in murine models of AD [244]. FGF-2 is even discussed as a potential treatment option for AD [245], although, as the LOM predicts, single therapeutic measures have a low probability of curing AD. Furthermore, the central release of serotonin is required for physical activity-dependent AHN, where it also plays a direct and acute regulatory role in young adult as well as aged mice [246]. This finding led to the conclusion that understanding exercise-induced AHN might offer preventive but also therapeutic opportunities in depression and age-related cognitive decline. Similarly, fat-cell-secreted adiponectin appears to mediate physical exercise-induced AHN [247], thereby acting as an antidepressant [248]. Interestingly, some hormones or cytokines are secreted directly by the working muscle itself, which can therefore be considered an endocrine organ [82]. The AHN-promoting hormones irisin [249] and meteorin-like [250] appear to be released solely by the active muscle.

Last but not least, the effects of physical activity on health and memory are also mediated by moderate levels of cortisol [251]. Whereas the positive effects of physical exercise peak at upper-middle intensity levels, thereby activating AHN, excessive cortisol levels due to extreme physical workouts inhibit AHN [252]. One mechanism, besides the psychological stress that often accompanies extreme physical activity, might be the increased release of pro-inflammatory cytokines like IL-6 from strained muscles, which activate adrenocorticotropic hormone (ACTH) secretion from the pituitary gland and, in turn, induce high levels of cortisol secretion [253]. This cortisol hypersecretion was found to aggravate neuroinflammation [254]. In contrast, regular and moderate physical exercise reduces systemic inflammation [255]. Similarly, while moderate physical activity enhances antioxidant defence mechanisms, intense levels of physical activity deplete the antioxidant reserve [256]. Therefore, extreme levels of physical exercise should be avoided, as they lead to adverse neurological effects [257]. Hence, in order to maintain physical and mental health, a balanced lifestyle with frequent workouts at moderate intensity is ideal, while physical inactivity (as well as over-activity) should be avoided whenever possible.

Besides the insufficient activation of AHN [258] that initiates the detrimental causal chain of events outlined above, physical inactivity directly increases the risk of developing AD by reducing the clearance of excess Aβ. A sedentary lifestyle was found to downregulate low-density lipoprotein receptor-related protein 1 (LRP1), the major export protein for Aβ [259], located in the BBB. Physical inactivity also inhibits the activation of neprilysin, one of the main Aβ-degrading enzymes [260]. In addition, being sedentary leads to persistent sterile neuroinflammation [237], insulin resistance, diabetes mellitus and visceral obesity. According to a recent study, these conditions all add to the individual risk of developing AD [261].

Important for therapeutic considerations, becoming physically active helps even when AD is already diagnosed. A recent pilot study investigated the effect of a home-based physical activity intervention program in AD patients on the development of clinical symptoms and functional abilities, as well as the effects on family caregiver burden, after 12 and 24 weeks [262]. While the (sedentary) control group experienced decreases in the performance of activities of daily living, the patients in the (physically active) intervention group remained cognitively stable. Analyses of executive function and language ability revealed considerable positive effects on semantic word fluency in the intervention group. In fact, the physically active patients improved during the intervention, whereas the controls showed continuous worsening. Consequently, caregiver burden remained stable in the intervention group but worsened in the control group.

As the LOM tells us, physical exercise alone is not enough to effectively prevent or cure AD. Nevertheless, in most studies the LOM was ignored, and in some studies the amount of daily exercise of the intervention group was in addition very low, which led the researchers to devalue exercise for the prevention of AD [263]. Unfortunately, studies of this type, which follow the scientific standard (only one variable should be altered), lead to misleading results, and their popularisation potentially undermines the public understanding of the importance of a healthy lifestyle. But being physically active is just one natural (and from an evolutionary perspective obvious) requirement that needs to be fulfilled. Another is the steady supply of essential nutrients.

Nutritional effects on AHN

There is a long list of nutrients that are essential building blocks for (adult) neurogenesis and synaptogenesis (for review, see [264]) and which are also important for NRJ. For instance, n-3 polyunsaturated fatty acids (PUFAs), flavonoids, antioxidant-rich berries and resveratrol, a polyphenol found in red grapes and other fruits, were all shown to stimulate AHN [265]. In addition, they reduce oxidative stress and down-regulate pro-inflammatory processes (for review, see [266]); some even have potent anti-amyloidogenic properties [267, 268].

In contrast, a typical WeDi, with its high intake of red and processed meats, animal fat, refined grains and sweets, is low in essential nutrients and, when simulated in animal studies, markedly reduces brain levels of neurotrophins such as BDNF and hampers neuronal plasticity and learning [269]. This type of modern diet also leads to increased ROS levels and neuroinflammation, which likewise hamper AHN and dysregulate the HPA axis through chronic cortisol hypersecretion [270]. There is also considerable evidence that advanced glycation end products (AGEs) might play a role in AD [271]. Sugar, fructose, corn syrup and generally foods, including beverages, with a high glycemic load cause intermittent hyperglycaemic episodes. This contributes to the endogenous generation of AGEs. In addition, meat products, particularly from industrial livestock farming, provide a large amount of exogenous AGEs (for review, see [272]). AGE (not age!)-induced neuroinflammatory processes are mediated by inflammatory cytokines like interleukin-1 (IL-1) [273], interleukin-6 (IL-6) [274] and tumour necrosis factor-alpha (TNF-α) [275]; they aggravate neurodegenerative processes and were shown to further decrease productive AHN [276].

Intensifying the problem of chronic neuroinflammation caused by AGEs is the high intake of omega-6 fatty acids (n-6 PUFAs), in absolute terms but also relative to n-3 PUFAs, whose intake is typically low in a typical WeDi [277]. In general, mediators derived from the n-6 PUFA arachidonic acid (AA) are pro-inflammatory, while those derived from n-3 PUFAs, such as docosahexaenoic acid (DHA), have anti-inflammatory properties [278]. Pro- and anti-inflammatory processes are essential for wound repair and healing, respectively, but require a balanced intake of both types of PUFAs. With the advance of industrially produced food, however, there has been a dramatic change from a healthy (and “natural”) n-6 to n-3 PUFA ratio of about 1:1 in the pre-modern diet to ratios of up to 20:1 in the current WeDi [279]. One reason for this development is the widespread use of vegetable oils with a high concentration of n-6 PUFAs, i.e. linoleic acid, such as sunflower oil (up to 70 %) and corn oil (up to 60 %). The use of these polyunsaturated oils was propagated by the food industry for their serum cholesterol-lowering properties, but a recent study based on available evidence from randomized controlled trials shows that replacing saturated fat in the diet with linoleic acid actually increased the risk of death from coronary heart disease or all causes by 22 % for each 30 mg/dL (0.78 mmol/L) reduction in serum cholesterol [280]. A healthy alternative would be the use of extra-virgin olive oil [281–283], as well as the moderate use of organically grown canola [284] or flaxseed oil [285], with healthier n-6 to n-3 PUFA ratios.
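How strongly oil choice shifts this ratio can be illustrated with a small calculation (the fatty-acid percentages below are rough, hypothetical round numbers in the spirit of the figures above, not authoritative nutritional data):

```python
# Hypothetical illustration of how oil choice shifts the dietary n-6:n-3
# PUFA ratio. The fatty-acid percentages are rough round numbers in the
# spirit of the figures quoted in the text, not authoritative data.

OILS = {              # approximate (n-6 %, n-3 %) content per oil
    "sunflower": (65.0, 0.2),
    "corn":      (55.0, 1.0),
    "flaxseed":  (14.0, 53.0),
    "olive":     (9.0, 0.8),
}

def blend_ratio(grams):
    """n-6:n-3 ratio of an oil blend, given grams used per oil."""
    n6 = sum(g * OILS[oil][0] / 100 for oil, g in grams.items())
    n3 = sum(g * OILS[oil][1] / 100 for oil, g in grams.items())
    return n6 / n3

r1 = blend_ratio({"sunflower": 10})
r2 = blend_ratio({"sunflower": 5, "flaxseed": 5})
print(f"sunflower only:        {r1:.0f} : 1")
print(f"half replaced by flax: {r2:.1f} : 1")
```

The sketch shows why a single dominant oil can push the overall dietary ratio far above the 1:1 pre-modern baseline, and why partially substituting an n-3-rich oil pulls it back down.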

Another reason for the increase of n-6 PUFAs in our modern diet is the intense consumption of meat and high-fat milk products. The fact that those products are derived from intensive animal farming aggravates the negative effect on the consumer's health, since the unnatural “way of life” of those animals negatively impacts the n-6 to n-3 PUFA ratio in their products [286, 287]. Reasons might be fattening feed, a lack of exercise and stress in the farmed animals. Last but not least, since the conversion of plant-derived n-3 PUFAs to DHA (or eicosapentaenoic acid (EPA), another physiologically important n-3 PUFA product) is a very inefficient metabolic process in humans, the low consumption of fish (the major source of animal DHA and EPA) in a WeDi leads to an absolute deficit in the supply of these two n-3 PUFAs. In addition, the high intake of n-6 PUFAs further compromises the already low metabolic conversion rate of plant-derived n-3 PUFAs to DHA and EPA [288]. The resulting serious imbalance of n-6 to n-3 PUFAs drives premature aging, neuroinflammatory processes, depression and AD [289, 290].

This development is unfortunate, since high concentrations of DHA would inhibit lipid peroxidation [291] and, together with EPA, would reduce brain inflammation and cognitive impairment [292]. Furthermore, many different neuroprotective cellular mediators derive from DHA and EPA as well as from another important n-3 PUFA, docosapentaenoic acid (DPA) (for review see [293]). For example, DHA-derived neuroprotectin D1 (NPD1) induces signalling for the homeostatic maintenance of cellular integrity and is neuroprotective due to its ability to inactivate pro-apoptotic and pro-inflammatory signalling pathways [294, 295]. Furthermore, NPD1 is a mediator with proven anti-amyloidogenic bioactivity [296], which was originally ascribed to DHA [297]. DHA was shown to positively regulate BDNF [298] and GDNF [299], thereby acting synergistically to enhance AHN and synaptic growth. In addition, PUFAs like AA and DHA are essential building blocks for all neuronal tissue, making up roughly 30 % of our brain's membrane lipids [300]. Taken together, a diet deficient in DHA limits neurogenesis, making its supply critical for healthy brain development and the maintenance of neural plasticity in later life [301]. Supporting these observations is the recent finding that lower intakes of nutrient-dense foods and higher intakes of unhealthy foods, typical of a WeDi, were each found to be independently associated with smaller left hippocampal volume in humans [302]. In contrast, supplementation with about 1 g per day of a mixture of the n-3 PUFAs DHA and EPA significantly improved episodic memory outcomes in older adults with mild memory complaints [303].

It is reasonable to assume that, at least for the hundreds of thousands of years of human evolution before the advent of the so-called “modern lifestyle” beginning in the 19th century, mankind was used to a diet with sufficient and roughly equal quantities of n-3 and n-6 PUFAs (for review see [297]). It might be no mere coincidence that the change to a marine food diet, which provided a stable and continuous supply of essential DHA (and EPA) sources, appears to have happened together with advancements in humans' cultural development, as exemplified by the bladelet stone-tool technology and other cultural advancements [304]. Indeed, there is now incontrovertible support from fossil evidence for the hypothesis that steady access to the marine food web must have played a critical role in humans' final steps of evolution [305]. This might have allowed humans to evolve big brains even without the ability to efficiently convert plant-derived n-3 PUFAs into DHA and EPA. Hence, humans are not at all adapted to the very recent and rapid change in dietary habits that causes a detrimental deficit in DHA as well as EPA.

While a lack of essential n-3 PUFAs causes neuroinflammation and disturbed AHN (wherefore supplements containing EPA and DHA were shown to have a positive effect on primary depression [306]), there is also substantial convergence of research findings to date indicating that seafood consumption reduces the risk of dementia and AD [307].

However, not all researchers agree with this, since a major breakthrough in AD prevention by adding fish to patients' diet has not been achieved (for a list of trials see [289]), and many therefore argue for longer-term human trials in the hope of better success rates [308]. Notwithstanding, one should be aware that only those participants of any cohort might benefit from DHA treatment who actually have a deficit in DHA and who, according to the LOM outlined above, have no deficit in any other essential requirement for a productive AHN and NRJ. Hence, even longer intervention trials might not raise the success rates as long as we do not rectify all concomitant deficits. In any case, one should not conclude that providing sufficient DHA is not essential simply because in most published cases it has on its own not been effective in preventing or treating AD (or major depression).

One key question, however, remains: Where should a steady world supply of DHA (and EPA) come from? With regard to meeting the assumed global consumption needs, there is growing concern about the sustainability of global fisheries. An economic and ecological alternative to selectively consuming wild-caught and organically farmed fish is to increase the supply of edible oil rich in DHA and EPA derived from organically grown micro-algae [309]. This would also circumvent the problem of contamination with methylmercury (MeHg) in fish (see below).

Mediterranean diet (MeDi)

A healthy alternative to the standard WeDi is the MeDi [310]. Since all Mediterranean countries have their own type of cuisine, one has to be aware that what we call MeDi is an artificial construct. Nevertheless, the principal aspects of this diet include proportionally high consumption of olive oil, legumes, unrefined cereals, fruits, nuts and vegetables, moderate to high consumption of fish, restrained consumption of dairy products, modest red wine consumption, and low consumption of non-fish meat and meat products [311]. According to a recent study, older adults adhering to a MeDi have less brain atrophy than those adhering to a WeDi, with an effect corresponding to about 5 years of aging [312]. As the authors of the study pointed out, higher fish and lower meat intake might be the two key food elements that contributed to the benefits of the MeDi on brain structure. Another study provided evidence that such dietary interventions play a role in the prevention of AD. Here, individuals without cognitive deficits showing lower adherence to a MeDi had cortical thinning in the same brain regions as clinical AD patients (i.e. entorhinal cortex, orbito-frontal cortex, inferior parietal lobule, inferior and middle temporal cortex and posterior cingulate cortex), compared to those showing higher adherence [57]. Interestingly, ApoE4 subjects showing higher MeDi adherence had the greatest benefit of all subgroups, which is in line with the aforementioned role of ApoE4 as primarily an accelerator of the AD process under unhealthy lifestyle conditions. A similar protective effect was found for the traditional Japanese diet (for review see [313]), which might at least partially explain the low AD prevalence rates in old Japan, as discussed above.

Importantly, the benefits of the MeDi, as of any diet protecting against chronic disorders, should not be attributed to its products alone. They are also strongly influenced by the food-processing techniques and the style of cooking. Mechanistic and epidemiological studies provide convincing evidence that food processing impacts the quality of the inherent phytochemicals, which in turn affects the protective properties of these foods against chronic diseases associated with inflammation (for review see [272]).

Vitamins

A WeDi is generally low in vital nutrients and leads to a reduced intake of vitamins and essential trace elements. In line with the LOM, each individual deficit inhibits productive AHN and was shown to increase the risk of AD; this includes deficits of vitamin A [314], the B-complex vitamins in general [315], vitamins B1 [316, 317], B3 [318, 319], B6, B9 and B12 (see below), C (for a comprehensive review see [320]), D (see below and [321]), and the E tocopherols [322]. As it is beyond the scope of this review to detail the impact of each deficit on AHN and the resulting risk of depression and AD, I will provide only two examples. Nevertheless, in AD prevention (and therapy), each deficit needs to be addressed.

Example 1

Nutritional deficiencies in either of the vitamins B6, B9 or B12 lead to an increased accumulation of homocysteine, an intermediate product of the L-methionine and L-cysteine amino acid metabolism, in which these three vitamins are essential cofactors. An elevated homocysteine blood level is a predictor of cognitive decline in AD [323]. Homocysteine induces oxidative stress through neuronal nitric oxide synthase activation and free-radical formation [324], thus causing a significant increase in neuronal cell death [325]. Furthermore, elevated homocysteine was shown to be detrimental to AHN [326], as it inhibits the proliferation of neuronal precursors by interfering with several signalling pathways required for cell proliferation [327]. A recent 10-year study based on post-mortem neuropathological and MRI findings showed that elevated homocysteine was related to Aβ accumulation and brain atrophy: the highest homocysteine quartile had an odds ratio of 3.78 for medial temporal atrophy and of 4.69 for periventricular white-matter hyperintensities [328]. Since all associations were independent of several potential confounders, including common vascular risk factors, deficiencies of either vitamin B6, B9 or B12 should be regarded as independent risk factors for AD due to homocysteine elevation.

However, although lowering homocysteine levels by supplementing B vitamins was shown to significantly slow the rate of accelerated brain atrophy in MCI (from 1.08 % per year in the placebo group to 0.76 % per year in the active treatment group) [329], according to the LOM neither full prevention nor a cure of AD can be expected as long as other essential risk factors are not eliminated as well. Only a preventive and therapeutic scheme that provides all essentials, as schematically outlined in Fig. 1, might be successful. Therefore, the management of potential micronutrient deficiencies can only be one, albeit important, aspect of AD prevention and treatment strategies (for review see [313, 330]).
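To put the quoted trial figures into perspective, a minimal back-of-envelope calculation (using only the two annual atrophy rates reported above; variable names are my own) shows that the B-vitamin intervention corresponds to roughly a 30 % relative slowing of brain atrophy:

```python
# Relative slowing of brain atrophy in the B-vitamin MCI trial [329].
# Inputs are the quoted mean annual brain-atrophy rates (% per year).

placebo_rate = 1.08   # placebo group, % brain volume lost per year
treated_rate = 0.76   # active (B-vitamin) group, % per year

# Relative reduction = absolute difference divided by the placebo rate.
relative_reduction = (placebo_rate - treated_rate) / placebo_rate
print(f"Relative slowing of atrophy: {relative_reduction:.1%}")  # → 29.6%
```

A roughly 30 % slowing is substantial, but, as argued above, still far from the full prevention the LOM predicts only when all essential deficits are corrected simultaneously.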

Example 2

Vitamin D deficiency is known to decrease bone density and to increase the risk of many common forms of cancer [331] as well as of cognitive impairment in older but also younger adults [332]. Vitamin D deficiency is also widespread: according to a recent study, its prevalence in the USA averages about 41.6 %, with the highest rates seen in blacks (82.1 %), followed by Hispanics (69.2 %) [333]. Vitamin D deficiency causes abnormal AHN [334]. While the results of a recent meta-analysis were consistent with the hypothesis that low vitamin D concentration is associated with depression (taken here as a sentinel disease for unproductive AHN), there is always the call for more randomised controlled trials to find out, for instance, whether vitamin D supplementation could indeed be useful for the prevention and treatment of depression [335] or cognitive decline [336]. I appreciate that there is a strong scientific need to better understand how vitamin D insufficiency dysregulates certain nerve growth factors and their respective signalling pathways [337], but waiting for a deeper scientific understanding of what goes wrong when an essential vitamin is missing does not help those who suffer from such a deficiency right now and are prone to develop serious diseases. Furthermore, and again, such interventions do not help those trial participants who already have sufficient serum vitamin D at baseline [338]. Second, and as stated above, participants will only benefit fully from vitamin D supplementation when, at the same time, all other missing essential factors required for a productive AHN are provided as well. Since this is rarely done, a weak or negative outcome regarding AD prevention should not lead to the false conclusion that the deficit in question is not causally linked to AD and need not be corrected.

Indeed, the results of a recent study confirm that vitamin D deficiency is strongly associated with a substantially increased risk of all-cause dementia and AD [339]. Conversely, a higher dietary intake of vitamin D was shown to be associated with a significantly lower risk of AD [340]. Furthermore, a recent study provided evidence that serum 25-hydroxyvitamin D levels of 70 nmol/L were associated with the lowest cardiovascular disease mortality risk [341], close to the values (>50 nmol/L) associated with a more than two-fold reduction in brain atrophy when compared with insufficient vitamin D levels [339]. In my opinion, all these results taken together warrant immediate action by primary care physicians but also by policy makers to make the screening for deficits in al