Gerontology, in press
The unfortunate influence of the weather on the rate of
aging: why human caloric restriction or its emulation may
only extend life expectancy by 2-3 years
Aubrey D.N.J. de Grey
Department of Genetics, University of Cambridge
Running head: the weather and the rate of aging
Key words: life expectancy, caloric restriction, nutrient sensing, life-extension drugs
Postal address: Dr. Aubrey D.N.J. de Grey, Department of Genetics, University of Cambridge, Downing
Street, Cambridge CB2 3EH, UK
Tel: +44 1223 765665
Fax: +44 1223 333992
Email: ag24@gen.cam.ac.uk
Abstract
Much research interest, and recently even commercial interest, has been predicated on the assumption that
reasonably closely related species – humans and mice, for example – should in principle respond to
aging-retarding interventions with an increase in maximum lifespan roughly proportional to their control
lifespan (that without the intervention). Here it is argued that the best-studied life-extending
manipulations of mice are examples of a category that is highly unlikely to follow this rule, and more
likely to exhibit only a similar absolute increase in maximum lifespan from one species to the next,
independent of the species’ control lifespan. That category – reduction in dietary calories or in the
organism’s ability to metabolise or sense them – is widely recognised to extend lifespan as an
evolutionary adaptation to transient starvation in the wild, a situation which alters the organism’s optimal
partitioning of resources between maintenance and reproduction. What has been generally overlooked is
that the extent of the evolutionary pressure to maintain adaptability to a given duration of starvation
varies with the frequency of that duration, something which is – certainly for terrestrial animals, and less
directly for others – determined principally by the weather. The pattern of starvation that the weather
imposes is suggested here to be of a sort that will tend to cause all terrestrial animals, even those as far
apart phylogenetically as nematodes and mice, to possess the ability to live a similar maximum absolute
(rather than proportional) amount longer when food is short than when it is plentiful. This generalisation
is strikingly in line with available data, leading (given the increasing implausibility of further extending
human mean but not maximum lifespan in the industrialised world) to the biomedically and commercially
sobering conclusion that interventions which manipulate caloric intake or its sensing are unlikely ever to
confer more than two or three years’ increase in human mean or maximum lifespan.
Introduction
The phenomenon of lifespan extension by caloric restriction has been studied in mice by gerontologists
for at least 70 years [1], and for at least the past 25 years has enjoyed intense research interest. [Note:
throughout this article, “lifespan” is used to denote the maximum lifespan or lifespan potential, which is
customarily defined as the mean longevity of the longest-living 10% of a given population.] A
development that has particularly spurred this line of work in the past decade is the rise of the nematode
worm, Caenorhabditis elegans, as a research organism in many areas of biology, including gerontology.
C. elegans also responds to starvation with a rise in lifespan. A reduction in food availability initiated at
the last larval moult extends lifespan by about 60% relative to “ad libitum” food density, very much the
same proportional extent as the maximum resulting from caloric restriction (CR) in mice [2].
C. elegans exhibits a much more dramatic phenotype, however, when subjected during early larval life to
a more severe (typically total) unavailability of calories. (It is not traditional to call this manipulation
“caloric restriction”, but that is a purely terminological convention which must not distract us from
comparing different degrees of deprivation of nutrients if such a comparison might prove illuminating.) It
enters a state known as dauer, a developmental arrest in which energy utilisation is drastically reduced
[3]. Dauer larvae can survive for three months without harm, as shown by resumption of development and
progression to fertile adulthood on reintroduction of food [4]. Moreover, this may underestimate their
potential longevity, for several reasons:
- the measurement cited above [4] has apparently not been repeated under a range of conditions such as liquid versus solid culture, food concentration prior to entering dauer, temperature, etc.;
- a maximum lifespan of six months has been achieved by a combination of genetic and surgical interventions whose gene expression effects substantially overlap the dauer profile [5];
- these original experiments, like most using C. elegans, involved feeding standard E. coli, which has been shown to reduce nematode lifespan relative to bacteria with a lipid composition more similar to typical soil bacteria [6];
- in the original study [4], individuals restored from dauer after two months had the same remaining post-dauer life expectancy as ones that had been in dauer for only five days, whereas a truly maximal period in the dauer state should intuitively entail a certain amount of aging. This indicates that dauers may simply starve to death rather than aging “only” 3-4 times more slowly than fed worms.
Numerous other poikilotherm species are also capable of surviving complete absence of nutrients for
similar or even longer periods (Table 1). Conversely, this degree of starvation of mice is, of course,
rapidly fatal.
Organism                                       Manipulation               Max. life extension        Reference
C. elegans                                     Genetic and anatomical     5 months                   5
Drosophila                                     Temperature, photoperiod   Several months to a year   7,8
Grasshopper                                    Desiccation                Several months             9
Mus musculus (relative to non-obese controls)  53% CR                     ~14 months                 10
Dog                                            25% CR                     ~14 months                 11
Cow (highly tentative)                         ~40% CR                    ~12 months                 12
Table 1. Maximum observed nutrient- or nutrient sensing-related life extension in a variety of taxa. The
life extension given is the greatest obtained by either direct caloric limitation or artificial manipulation of
nutrient sensing. The ages compared are maximum lifespan, defined as mean of the longest-lived 10%.
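The contrast between proportional and absolute comparisons implicit in Table 1 can be made concrete with a quick calculation. The control-lifespan figures below (roughly three weeks for an adult C. elegans, roughly three years for a laboratory mouse) are approximations assumed here for illustration; only the extension values come from Table 1:

```python
# Illustrative comparison of absolute vs proportional life extension.
# Control lifespans (in months) are rough assumed values; the maximum
# extensions are those listed in Table 1.
species = {
    # name: (approx. control lifespan in months, max extension in months)
    "C. elegans": (0.75, 5),    # ~3-week adult lifespan; ~5-month extension
    "Mus musculus": (36, 14),   # ~3-year lifespan; ~14-month extension
}

worm_span, worm_ext = species["C. elegans"]
mouse_span, mouse_ext = species["Mus musculus"]

lifespan_ratio = mouse_span / worm_span    # mice live ~50x longer
extension_ratio = mouse_ext / worm_ext     # extensions differ only ~3x

print(f"ratio of control lifespans:   {lifespan_ratio:.0f}")
print(f"ratio of absolute extensions: {extension_ratio:.1f}")
```

Even with generous error bars on the assumed lifespans, the two ratios differ by more than an order of magnitude, which is the non-proportionality the rest of this article seeks to explain.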
A traditional gerontological interpretation of the observations mentioned thus far is as follows:
- Moderate CR induces a moderate increase in lifespan.
- In simple organisms such as nematodes, metabolism is sufficiently plastic that a much more severe reduction in food supply can be tolerated, by adopting a drastically altered metabolic state.
- In higher animals, by contrast, the metabolic state is more tightly constrained by the structure and function of various tissues, so no comparably fundamental metabolic shift is possible. Teleologically speaking, evolution has come closer to “running out of ideas” to increase the lifespan of already long-lived species.
However, the description of such animals as “simple” is controversial: after all, all metazoans yet
sequenced have over half as many genes as mammals. Here it is argued that the above analysis is
evolutionarily oversimplistic, and that the true reason why mice lack the ability to treble (or more) their
lifespan in response to environmental vicissitudes is that they have not evolved in the presence of
adequate selective pressure to develop (or to retain) that ability. It is proposed that the major determinant
of the degree of this selective pressure is the frequency distribution of different durations and intensities
of starvation in nature, something that is governed principally by the weather (via the oscillating
abundance of vegetation) and is thus similar for all terrestrial animals. All such species “know” how to
live several months longer when starved than they do when food is plentiful; on this hypothesis, that is
because this is a useful facility often enough to be maintained during evolution. Conversely, it is proposed
that the reason no such species yet studied knows how to live five years longer than normal is that this is
too seldom useful to have been evolved or retained. Available data (Table 1) are consistent with this idea:
while some organisms have been studied much more thoroughly than others and future data may tell a
different story, the maximum absolute life extension elicitable in long-lived animals seems at present to
be only a little more than in short-lived ones. The ratio of absolute life extensions is far less than the ratio
of lifespans.
In this article, I first review some established concepts in the evolutionary biology of aging and thereby
highlight how it has become unquestioned that similar interventions should produce similar percentage
increases in lifespan – a belief which I term the “proportionality principle”. I then discuss why
interventions related to nutrient sensing or availability are likely, on purely evolutionary grounds, to
depart from that principle, and I propose a broadly quantitative prediction of that departure which is
strikingly in line with the data presented in Table 1. Then I address various potential objections to the
hypothesis, including ostensible counterexamples, and conclude by discussing its relevance to the
burgeoning search for pharmaceutical retardants of human aging.
Three types of evolutionary pressure not to age
Aging deceleration caused by reduced extrinsic mortality
In the 1950s, seminal articles by Medawar and Williams overturned a tenet of gerontological
conventional wisdom that had gone unchallenged for over half a century. Weismann had suggested in the
late 19th century [13] that aging exists because it facilitates natural selection: the presence of healthy but
aged individuals would hinder the ability of younger members of the species to compete with each other,
resulting in slower adaptation to the environment and consequently, in the long run, a competitive
disadvantage of the species as a whole relative to species whose older individuals were continually
removed from the population. Medawar noted [14] that this could not be right, because aging is virtually
unknown in natural environments: even species at the top of the food chain experience so high a rate of
mortality throughout their lives (from starvation, hypothermia and so on) that they almost never live to
ages at which, in a protected environment, they begin to show signs of declining function. He proposed
that the real reason aging exists in protected environments is because it has not been selected away in the
wild: traits that confer frailty only at ages when extrinsic mortality has done away with all individuals
anyway will experience no selective pressure for elimination during evolution. (This insight is often misstated as a low selective pressure to maintain the organism beyond its reproductive lifespan; while that is
true, it misses the point, because evolution typically extends a species’ reproductive lifespan in tandem
with its total lifespan.) Five years later, Williams refined this idea [15] by pointing out that, occasionally,
genes or pathways would evolve which were beneficial to survival early in life but also deleterious at
advanced ages, so that aging might be partly driven by processes actively retained by natural selection.
These ideas were later further refined: in 1967, Edney and Gill proposed [16] that the rate of extrinsic
mortality (from predation, starvation, etc.) of a population would, in general, modulate its rate of intrinsic
mortality – that is, its rate of aging. This hypothesis has since been robustly confirmed (e.g., ref. 17).
Aging deceleration from pressure to preserve function for more than a lifetime
A decade later, the evolutionary theory of organismal aging was extended into a theory of tissue aging.
Kirkwood’s "disposable soma" theory [18] notes that the organism must take much better care of its germ
line than of any other tissues, because the latter are of no relevance to the long-term survival of the
species once the organism's offspring have reached independence, whereas the germ line must propagate
undamaged indefinitely. Again, this is now clearly supported by the available evidence (e.g., ref. 19).
Aging deceleration from pressure to reproduce when offspring will survive
The topic that concerns us here is a third type of evolutionary influence on the rate of aging, as different
from both the concepts just outlined as they are from each other. This third response to the variation in
evolutionary pressure not to age concerns not variation between individuals (or populations), nor between
tissues within a single organism, but between environments which an organism is reasonably likely to
encounter.
Of all the major causes of extrinsic mortality, starvation is the only one that substantially alters the typical
organism's "optimal" rate of aging. Food scarcity makes it unlikely that any offspring one may have
would survive to adulthood, so it is preferable to divert effort into preserving oneself until food returns.
But when food is plentiful, the best strategy is to make good use of that food by turning it into progeny.
There is no way out of the fact that breeding is hard work, so the ideal strategy for fitness (maximising
progeny that survive to have their own) is to possess genetic machinery that will allow nutrient
availability to modulate the partitioning of energy expenditure between breeding now and maintaining the
ability to breed later. Unlike, for example, the presence or absence of predators, the presence or absence
of abundant food is something that often varies unpredictably during an organism’s life on a timescale
amenable to responses involving altering gene expression patterns, thus rewarding the possession of
genetic machinery to respond to such variation in a fitness-maximising manner. Physiologically, this is
manifest in very different ways in different organisms, from sporulation in yeast to dauer larva formation
in nematodes to cancer defence in laboratory mice; this is beginning to be seen to have resulted in the
appearance of quite stark differences in the downstream aspects of the gene expression pathways induced
by CR, even though the upstream aspects are highly conserved [20]. This general idea was introduced
independently about 15 years ago by Harrison and Archer [21] and by Holliday [22] and is generally
accepted. (Harrison and Archer [21] went further, as will be discussed below.) A potential objection is
that the most effective laboratory CR experiments do not impose simple starvation but typically
supplement the food with micronutrients, thus restricting calories but maintaining optimal nutrition.
However, one might expect the patterns of gene expression changes seen in CR experiments to be largely
independent of micronutrient supply, and indeed this was recently shown [23]. It should also be noted that
this logic still applies even in environments in which only a small proportion of individuals avoid
extrinsic causes of death (such as predation) long enough to benefit from their slowed aging, so long as
the chance of survival of any offspring born during a famine is low.
Nevertheless, this model does not explain all that one might like to explain about the magnitude of the
life-extension response – in particular, the stark non-proportionality of the maximum life extension
elicited (by any directly or indirectly starvation-related means) in different species. I present below a
refinement which seeks to do that.
The proportionality principle and its limitations
There is no persuasive evidence that evolution has remotely approached the limits of its ability to produce
progressively longer-lived organisms, even within the confines of potentially challenging “design
constraints” such as homeothermy. Rather, the non-existence of (for example) terrestrial mammals with a
life expectancy exceeding 100 years may well be a consequence merely of the non-existence until very
recently of a terrestrial mammal with a level of extrinsic mortality low enough to make such a life
expectancy evolutionarily advantageous. For example, evidence is accumulating that inherent resistance
to infection, manifest as a robust inflammatory response, accelerates the progression of a wide range of
age-related diseases including Alzheimer’s, cardiovascular disease and diabetes [24], and it also impairs
fertility [25]. This means that the dramatic reduction of early death from infections in developed nations
may already, after only a few generations, have begun to result in selection of traits that make humans
more susceptible to no-longer-life-threatening infections but less prone to autoinflammatory cascades
with cumulative impact on tissue function, increasing their longevity. Evidence that this is already
increasing life expectancy (via a population shift to a less pro-inflammatory cytokine profile) is now
available [26]. Interestingly, the rapidity of this response is comparable to the rate of percentage increase
of life expectancy per generation seen in populations of fruit flies reared under conditions selective for
increased lifespan [27]. It seems from these examples that there may be, to a first approximation, a
species-independent maximum percentage rate per generation of evolution of longer life inducible by an
abrupt alteration in environmental conditions – which would not be predicted if evolution were “running
out of ideas” to make long-lived organisms live even longer. We can thus propose that evolution of
longevity seems broadly to follow a “proportionality principle”.
The disposable soma theory admits similar analysis. Reproductive senescence may in certain
circumstances be an adaptation in its own right [28], but in general it precedes mortality in protected
environments simply because so few individuals survive to old age in the wild that there is negligible
selective pressure to build the reproductive system to function well for that long. Hence, an organism that
is evolving a longer lifespan will tend to evolve better reproductive system maintenance (and hence a
longer reproductive lifespan) to go with it; this is reflected in the quite limited cross-species variation in
the ratio of reproductive lifespan to total lifespan [29].
The same does not apply, however, to nutrient-mediated adaptation of the rate of aging. Being mainly a
function of the weather (via its effect on the abundance of vegetation), starvation of a given duration
necessarily occurs at broadly the same frequency, on average, for all terrestrial animals – whatever their
ad libitum-fed life expectancy. Here it is proposed that this is why the maximal observed effect of nutrient
shortage (or simulation of such shortage at the gene expression level) in worms and in mice differs only
by a factor of about 3 in the absolute amount of life extension, even though well-fed mice and worms
have lifespans differing by a factor of 50 or so: there is negligible evolutionary pressure for mice to
maintain the ability to vary their life expectancy by a factor of 5 on demand, whereas that pressure is
considerable for worms.
It should be stressed that the elicitation of life extension by genetic manipulation is in this regard much
more similar to a starvation response than to the effect of evolution, even though starvation does not
change genes and evolution does. This is because presently feasible (or indeed designable) genetic
manipulations are restricted to one or a few genes (though many other, unmanipulated genes may be
affected in consequence). A response to environmental conditions works similarly, with one or a few
genes reacting to changed metabolite concentrations (and thence perhaps altering the expression of many
others downstream), whereas evolution coordinately alters as many genes as the selective pressure
influences. A related point of possible confusion is that a species exposed for many generations to
sustained food shortages will evolve the ability to tolerate this, but must not at the same time adopt a
virtual or total cessation of reproduction as is warranted when the starvation is briefer. Thus, very long-term environmental challenges cannot be considered in the same way as those against which temporary
reallocation of resources away from reproduction is a viable defence.
What’s special about a year?
The concept that starvation will extend long-lived species’ lifespans by a smaller proportion than short-lived species’ lifespans was introduced by Harrison and Archer [21] and led them to predict a smaller life-extension response to CR in the long-lived white-footed mouse, Peromyscus leucopus, than in Mus
musculus – something that has apparently still not been tested. However, as it stands, their model makes
no such prediction. Rather, the essentially random pattern of starvation in the wild would seem to predict
a maximal elicitable life extension that is proportional to the “ad-libitum” lifespan. The longer one lives,
the more starvation one is likely to experience over one’s whole lifetime; hence, the more pressure there
will be to live a given absolute amount longer; correspondingly, the absolute life extension for which there is enough selective pressure to retain the necessary genetic machinery will surely be greater in longer-lived organisms.
The flaw in this logic is that the distribution of starvation is not scale-free; it is much “clumpier” at
timescales less than a year than at greater ones. That is to say: because of the strictly circannual cycle of
availability of vegetation (the direct or indirect food source of all terrestrial animals), the probability that
food will be scarce tomorrow, if it is scarce today, is high, and the probability that food will be scarce
next month, if it is scarce this month, is comparably high, but the probability that food will be scarce next
year, if it is scarce this year, is considerably lower (though still, presumably, higher than if food is
plentiful this year). This means that, on the one hand, if the frequency of a given severity of starvation
lasting a month is enough that a given species gains a fitness advantage by maintaining the genetic
machinery to survive it, then that species will also very probably gain a fitness advantage by surviving
that same severity of starvation for two months, or perhaps as much as a year (since lean summers
presage lean winters), because the frequency of that longer period of starvation will be comparable to the
frequency of the shorter one. On the other hand, however, the frequency of two successive years of
starvation is considerably less than the frequency of one, and may not be enough to make those two years
“worth” carrying around the machinery to survive even if it is worth surviving one such year. Further,
this makes no assumption that the periods of famine for all species in a given locale will coincide (which
may well not be true – losses in some populations may allow competing populations to flourish even in
the face of a generally reduced common food supply): all that is asserted is that each species will
experience periods of famine, and that periods up to a year or so will be of roughly comparable frequency
but periods longer than that will be rarer.
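This “clumpiness” argument lends itself to a toy simulation. In the sketch below, every probability is an invented illustrative assumption, not a value fitted to real weather or ecological data: lean months cluster within lean years, lean years are drawn independently, and we count how often famines of various durations arise. Famines of a couple of months and famines lasting most of a year occur at broadly comparable frequency, while famines spanning two years are roughly an order of magnitude rarer:

```python
# Toy Monte Carlo of the "clumpiness" of starvation. Within a lean year
# consecutive lean months are common, so famine durations of 1-12 months
# occur with comparable frequency; a famine spanning two years requires
# two independent lean-year draws and is correspondingly much rarer.
# All parameters are illustrative assumptions.
import random

random.seed(1)
P_LEAN_YEAR = 0.1           # a year is lean with probability 10%
P_LEAN_MONTH_IN_LEAN = 0.9  # months within a lean year are almost all lean
P_LEAN_MONTH_IN_GOOD = 0.05
N_YEARS = 200_000

months = []  # True = food scarce this month
for _ in range(N_YEARS):
    lean_year = random.random() < P_LEAN_YEAR
    p = P_LEAN_MONTH_IN_LEAN if lean_year else P_LEAN_MONTH_IN_GOOD
    months.extend(random.random() < p for _ in range(12))

# Lengths of maximal runs of consecutive lean months (= famine durations).
runs, run = [], 0
for lean in months:
    if lean:
        run += 1
    elif run:
        runs.append(run)
        run = 0
if run:
    runs.append(run)

def per_year(k):
    """Frequency (per simulated year) of famines lasting >= k months."""
    return sum(r >= k for r in runs) / N_YEARS

print(f"famines >= 2 months per year:  {per_year(2):.4f}")
print(f"famines >= 8 months per year:  {per_year(8):.4f}")
print(f"famines >= 20 months per year: {per_year(20):.5f}")
```

Because a twenty-month famine requires two consecutive lean years, its frequency falls off roughly as the square of the lean-year probability; this quadratic drop-off at the year boundary is the asymmetry on which the present hypothesis relies.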
This contrast between sub-year and multi-year “clumpiness” of starvation is more acute than it might at
first seem, because the frequency of starvation (or of any life-threatening environmental phenomenon)
that is needed for an organism to be better off maintaining the genetic wherewithal to exploit or survive it
is very low – much less than once per generation, since all that is needed is for those that have lost that
machinery by random mutation to be culled before too many individuals have lost it. On the assumption
that more severe famines are rarer than less severe ones, and on the further assumption that genetic
pathways conferring the ability to survive a severe famine will also (perhaps with minor adjustments)
confer the ability to survive a less severe one, such that only one major famine-survival mechanism is
likely to exist in a given organism, we can thus predict that the degree of famine which elicits the greatest
life extension that a given organism knows how to exhibit is one that is encountered quite rarely in nature
– just often enough to be worth knowing how to survive. Such famines would, therefore, be very unlikely
to occur in two consecutive years, even if there is a modest correlation between a famine one year and a
famine the next. This may at first seem at odds with the fact that the severity of CR used in typical mouse
experiments must actually be quite common in nature, but that challenge fails to take into account the
typical variability of nutrient availability during a long famine. Famines that are long and severe enough
to elicit the organism’s maximal life extension response may not in fact be all that rare in terms of
average severity measured over their entire duration, but most such famines may contain periods of
substantially worse food shortage, sufficient to kill the organism quickly. Only famines that do not
contain such periods are relevant to the selective pressure to survive.
However, it must be acknowledged that, even after taking the pattern of famine into account, a longer-lived animal will have a greater chance of experiencing more famine in total and thus should be expected
to survive longer – just that the ratio of the maximum absolute life extensions should be much closer to
unity than the ratio of lifespans. This appears to be so for worms and mice, whose lifespans differ by a
factor of about 50 but whose maximum observed life extensions differ by only a factor of three.
Finally, we must consider the special case of equatorial environments, in which there are no seasons and
the above logic appears inapplicable. Here there is still a difference in “clumpiness” between sub-year
and multi-year availability of food, but only because there is an inherent inertia in that supply –
vegetation can appear or disappear only so fast. (This weaker version of the hypothesis is similar to that
presented by Harrison and Archer [21], though avoiding their reference to maximum reproductive
lifespan.) Accordingly, in equatorial regions there may be more variability than in temperate regions in
the maximal life extension elicited, as the absence of a season in which most vegetation disappears lets
the correlation of abundance of vegetation remain higher on a multi-year timescale than in non-equatorial
regions. This should, however, apply only to species that are restricted to equatorial regions: ones with
gene flow between equatorial and temperate regions will experience the effect of seasons on the
evolutionary pressure to retain life-extending capability of a given degree.
Relation between severity of famine and life extension elicited
The above considerations have perhaps been overlooked because of a clear cross-species relationship
between CR and life extension that is indeed proportional to lifespan: namely, that lifelong CR of a given
severity seems to elicit more or less the same proportional increase in life expectancy in a range of
species, up to a threshold of severity that kills them (with the boundary between these two outcomes
being quite a narrow range). The most effective degree of CR ever used in published mouse experiments,
53% reduction, gives about a 35% life extension [10], and it also does so in C. elegans [2]. How can this
be reconciled with the non-proportionality discussed above?
There is in fact no conflict between the two observations. The life extension machinery that is selected for
development/retention by evolution is that which applies to the most severe famine frequent enough to be
“worth” surviving, but we may expect a more muted expression of that same machinery to underpin the
organism’s response to milder famines too, just as, for example, the human response to mild fear is a mild
adrenalin release. The response of C. elegans to moderate starvation can thus be expected to be a milder
version of its most sophisticated response, which is to near-total starvation.
The same applies when comparing rodent data with that on longer-lived organisms (see below). The only
CR experiment yet completed on a species that lives over a decade used only 25% CR and gave a life
extension similar, as a proportion of lifespan, to that elicited by 25% CR in rodents. For most purposes it
would be inappropriate to compare very different levels of CR in different species, but in the present
context that is just what we must do, as we are concerned with the maximum life extension elicitable by
any starvation-related method, and no CR-induced life extension of dogs by more than this has yet been
reported.
Finally it should be mentioned that rodents of different strains exhibit a wide range of degrees of life
extension from the same degree of CR. Inbred mouse strains also vary considerably as to ad lib-fed
lifespan, due to a wide variety of genetic defects, and those with shorter lifespans tend to benefit most
from CR when life extension is measured as a proportion of lifespan. However, it would be dangerous to
extend the present hypothesis to this situation, since CR’s effect on resistance to genetic defects may not
be comparable to its effects on genetically healthier animals.
Compatibility with experimental data
The hypothesis outlined here might at first appear to be no more than statistical trickery, were it not for
the striking accordance of its predictions with available data. Experimental justification mentioned so far
for the minimal dependence of the maximum achievable absolute life extension on control lifespan has
been limited to the response to energy deprivation of wild-type mice and wild-type nematodes. However,
it is much more general; indeed, no exceptions are apparent (a representative sample of published data is
given in Table 1). I now examine the data in Table 1 in more detail, as well as addressing some examples
not included there which might at first sight be considered counterexamples to the present hypothesis.
The longest-lived species for which a life extension experiment using CR has been completed is the dog.
Labrador retrievers were given a diet reduced by 25% in calories from the age of 8 weeks. Their
maximum lifespan was about 14 months longer than that of control animals [11]. The authors of this
study did not discuss why they chose a CR level of 25%, rather than the 40% more typical of rodent CR
experiments; the hypothesis presented here predicts that a 40% CR regime would not extend the dogs’
lifespan further (and might be harmful).
This study [11] was predated by 30 years by a report on life extension by CR in cows [12], which has
been over-zealously summarised in some quarters and thus warrants comment. Pinney et al. compared
cows fed 60%, 120% and 200% of the amount thought optimal from November to April each year and ad
lib for the remaining months; also, each of these three groups was divided into two so as to examine the
effect of age at first parturition. (It is presumed, but not stated in this paper, that the high-fed group ate all
the food they were given during the winter.) The group fed the most and mated at the younger age
happened to experience a high early mortality. This resulted in a difference in mean lifespan between that
group and the early-parturition CR group of nearly four years, which if considered in isolation (as some
have done) would appear to conflict severely with the hypothesis presented here. When the experiment is
considered as a whole, however, and with the strenuous proviso that precise numbers are unavailable
because the experiment was terminated when 18% of the animals were still alive (including 10% of the
most-fed ones), it appears that the difference between the maximum lifespan (defined, as is customary, as
mean lifespan of the longest-lived 10%) of the CR (i.e. 60% optimal) group and those of the other groups
would probably have been no more than six months. Since the CR level imposed was too mild to impact
fertility severely, we may extrapolate that perhaps a year of life extension could have been achieved if CR
at the same level (40%) had been imposed year-round. However, it must be assiduously stressed that this
study was far too incomplete to be robustly compared with modern work (in any species) and is noted
here only in anticipation of overinterpretation of its findings in reaction to the present hypothesis.
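The "mean lifespan of the longest-lived 10%" definition used above is simple to compute; a minimal sketch follows, using an entirely hypothetical cohort (the Pinney et al. raw data are not reproduced here):

```python
# Maximum lifespan as conventionally defined: the mean age at death of the
# longest-lived 10% of a cohort. The ages below are hypothetical, for
# illustration only.
def maximum_lifespan(ages_at_death, fraction=0.10):
    """Mean age at death of the longest-lived `fraction` of the cohort."""
    n = max(1, round(len(ages_at_death) * fraction))
    longest_lived = sorted(ages_at_death, reverse=True)[:n]
    return sum(longest_lived) / n

# A hypothetical cohort of 20 animals (ages in years):
cohort = [8, 9, 10, 10, 11, 11, 12, 12, 12, 13,
          13, 13, 14, 14, 14, 15, 15, 16, 17, 18]
print(maximum_lifespan(cohort))  # mean of the top 2 ages: (18 + 17) / 2 = 17.5
```

Note that when an experiment is terminated while some animals remain alive, as in the cow study, this statistic can only be bounded, not computed exactly.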
Mention must also be made of the ongoing studies of non-human primate CR, since interim results from
these studies have led some to anticipate that they will demonstrate a multi-year life extension. There are
several reasons to be cautious about such a conclusion at this time, and in particular to avoid regarding
currently available data [30], which show at least a five-year extension of median longevity in a group of
eight Rhesus monkeys on CR, as refuting the hypothesis presented in this paper. First, as stressed above,
we are interested here in maximum lifespan (currently unknown in the primate work), not mean or
median lifespan. This does not limit the scope of the hypothesis: all evolutionary arguments ultimately
revolve around the fittest subset of the population, since their descendants predominate in subsequent
generations. Second, we must not forget that primate husbandry is in its relative infancy compared to
rodent husbandry; mean lifespan extension induced by CR was larger in very early work than more
recently [31], perhaps because inadequate husbandry was a mortality risk against which CR protected.
This will be tested by the complete survival curves of control monkeys, since ideal husbandry should give
a distribution well approximated by a Gompertz curve, rather than a Gompertz-Makeham one with a
substantial age-independent mortality rate. Third, the maximum lifespan of model organisms may be a
better indicator than their mean lifespan of the mean lifespan of humans given the same sort of treatment,
as will be discussed below.
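The distinction between the two mortality models invoked above can be stated explicitly; the notation below is the conventional one, not taken from the cited studies:

```latex
\mu(x) = A e^{Gx} \quad \text{(Gompertz)}
\qquad\qquad
\mu(x) = M + A e^{Gx} \quad \text{(Gompertz--Makeham)}
```

where $\mu(x)$ is the age-specific mortality rate, $A$ and $G$ parameterise the intrinsic (age-dependent) component, and $M$ is the age-independent Makeham term. Adequate husbandry should drive $M$ towards zero, leaving a pure Gompertz survival curve.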
The fruit fly, Drosophila melanogaster, has received much attention from gerontologists studying life
extension for many years, especially since the landmark results of Rose and colleagues [27]. They
eventually achieved a doubling of life expectancy by selection for late-age fecundity. More recently,
several D. melanogaster mutants have been isolated that show a comparable life extension [32-35].
However, even a doubled fly lifespan is still clearly much less than a year. Is this then an exception to the
generalisation under discussion? Not at all, because it is quite straightforward to increase flies’ lifespan
by several months by inducing a state known as diapause [7].
Diapause is the state in which many insects survive the winter. It somewhat resembles mammalian
hibernation. This largely explains why it has been neglected by most gerontologists – the long-standing
appreciation of the relationship between aging and metabolic rate has diverted attention from life
extension phenomena that appear to be caused “merely” by low body temperature. However, as recently
reviewed by Tatar [8], diapause is not in fact simply a low-temperature phenomenon. First, in some
insects it occurs as a defence not against cold but against desiccation in the summer: grasshoppers are
examples of this [9]. Second, it can occur in the absence of any environmental change whatever, but
purely because of the animal’s life cycle. Prominent examples are Monarch butterflies, which live several
times as long during migration as at other times of year despite a greatly elevated metabolic rate during
migration and low food intake during overwintering [36].
Finally, mention must be made of certain very primitive species which are able to enter a state of such
extreme metabolic suspension that they can survive more or less indefinitely. Sporulation in yeast and
anhydrobiosis in rotifers are the best-known examples. Here the simplest explanation is that the way that
evolution has found to extend the organism’s lifespan by the useful amount (several months) is to shut it
down completely, so that no metabolic activity at all occurs until the suspension is reversed by external
chemical cues. This is a response that would only be expected in extremely simple organisms, since
complete but reversible cessation of metabolism is an operation whose complexity necessarily varies with
that of the organism. Hence, it is reasonable not to regard the indefinite lifespan of organisms in such
states as a meaningful challenge to the hypothesis presented here.
The non-proportionality principle described thus far for wild-type animals should in theory apply equally
to genetic or pharmacological interventions that modulate the genetic pathways underlying this
starvation-inducible life extension. Is this confirmed experimentally? It appears so. In all the best-studied
model organisms, interventions have been found that increase lifespan by a large proportion of the
amount which natural environmental cues produce [35,37,38], but no such intervention has yet extended
any organism's lifespan appreciably beyond that amount, except for C. elegans if the dauer state really
does have a maximum survival of only three months (discussed above). Nor should we expect them to,
since the same pathways are being modulated. (It is, however, a strong prediction of the present
hypothesis that C. elegans dauer survival should at least approach six months under ideal conditions.)
A challenge to the validity of the above assertion is that wild-type nematodes exhibit a 60% increase in
lifespan resulting from ablation of the germ cells in adulthood [38], and when this surgery is done on
already long-lived mutants the life extension is a factor of six [5]. This is a stage in the life cycle at which
nutritional modulation of wild-type worms cannot elicit nearly so much life extension, since the dauer
state cannot be entered. However, the genetic pathways triggered by this intervention resemble those that
accompany entry into the dauer state [5,38]. Hence, though this is an important demonstration that
something like the degree of life extension achievable in early life can also in principle be induced in
adulthood (if the relevant gene expression can be artificially induced), it does not contradict the idea that
modulation of endogenous genetic machinery related to the starvation response can usually confer several
months’ extra life but cannot extend any organism’s lifespan by much more than a year.
Implications for retarding human aging by modulating nutrient sensing
A generalisation that holds across species ranging in lifespan from a few weeks to a decade seems very
likely to extend to species living several decades. The argument presented here suggests, therefore, that
humans are likely to have “forgotten” how to respond to nutrient deprivation by slowing their aging by as
much as 30-40%, as mice can and our common ancestors probably also could. We may well have retained
the residual ability to slow our aging enough to confer a couple of years of life extension, but the chance
that we can obtain a decade or more is slim.
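The gap between the two scaling rules at issue can be made concrete with round numbers; the figures below are approximate and purely illustrative (mouse CR gains of roughly a year, and an order-of-magnitude human lifespan of 80 years):

```python
# Contrast of the two scaling rules discussed in the text.
# All figures are approximate, for illustration only.
MOUSE_MAX = 3.0      # years: control maximum lifespan of a mouse (approximate)
MOUSE_CR_GAIN = 1.0  # years: lifespan added by CR in mice (approximate)
HUMAN_MAX = 80.0     # years: round-number human figure (illustrative)

# "Proportionality principle": the gain scales with control lifespan.
proportional_gain = HUMAN_MAX * (MOUSE_CR_GAIN / MOUSE_MAX)

# Hypothesis argued here: the absolute gain is roughly species-independent.
absolute_gain = MOUSE_CR_GAIN

print(round(proportional_gain))  # about 27 years
print(absolute_gain)             # about 1 year
```

The order-of-magnitude difference between these two predictions is the substance of the disagreement with [42].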
The focus in this article has been on maximum, rather than mean, lifespan. Surely, it may be argued, any
intervention that can increase mean lifespan by delaying major life-threatening diseases must be urgently
pursued, even if maximum lifespan is not increased appreciably? That is certainly true. However, the
postponement of such diseases in humans has already been very substantial in the past century, leading to
a considerable rise in life expectancy in the industrialised world, so we must ask whether additional
measures, which might indeed delay the onset of such diseases [39], would also delay death from them. A
major reason for caution here is that a number of recent studies have shown that the phenomenon of
“rectangularisation” – broadly, the increase of mean lifespan faster than maximum lifespan – has ceased
throughout the industrialised world [40]. In other words, we are now succeeding in extending mean
lifespan only by means that also extend maximum lifespan; thus, an intervention that does not appreciably
extend maximum lifespan is also unlikely to extend mean lifespan in humans, even if it does so in other
animals. This is not to belittle the postponement of the onset of such diseases, by any means – only to
guard against unsupported hopes with regard to lifespan.
Additionally, it appears unlikely that pharmacological or genetic induction of the pathways engaged by
CR will ever confer appreciably greater life extension than starvation itself can in a given species. As
noted above, the sole exception to this generalisation observed so far is the extension of C. elegans
lifespan to twice what has been seen in the dauer response to starvation, and this may be due merely to a
lack of attempts to maximise dauer longevity by varying certain experimental conditions, or to the
possibility that dauers simply starve to death rather than aging.
These points appear to have been overlooked by those who feel that manipulation of genetic pathways
involved in nutrient sensing has sufficient biomedical potential to merit the founding of companies to
develop such interventions for human use [41]. The goal of human life extension research is the extension
of human healthy life expectancy by much more than a couple of years. If that is all that would result
from taking a new drug for much of our life then we are palpably better advised to desist – thereby
avoiding the necessarily unknown risk of long-term side-effects – and wait for something much more
powerful to appear, whose greater efficacy outweighs its later availability. Indeed, some who are
enthusiastic about near-term prospects for commercial success of drugs that modulate nutrient sensing
and related pathways publicly base their optimism on the idea that such interventions may give us 10-20
years of extra life [42]. The logic presented in this article shows that there is no basis for such a belief: not
only is there abundant data contradicting the “proportionality principle” upon which that belief is based, it
should not even be considered the null hypothesis. Health benefits may indeed result from such drugs
[39], however, so it would be inappropriate to discourage their development; but realism as to what they
are and are not likely to achieve is desirable.
Acknowledgements
I am indebted to Jay Olshansky, Tom Perls and two anonymous reviewers, one of them exceptionally
painstaking, whose comments substantially improved the manuscript.
References
1. McCay CM, Crowell MF, Maynard LA: The effect of retarded growth upon the length of life span
and upon the ultimate body size. J Nutr 1935;10:63-79.
2. Klass MR: Aging in the nematode Caenorhabditis elegans: major biological and environmental
factors influencing life span. Mech Ageing Dev 1977;6:413-429.
3. Riddle DL, Swanson MM, Albert PS: Interacting genes in nematode dauer larva formation. Nature
1981;290:668-671.
4. Klass M, Hirsh D: Non-ageing developmental variant of Caenorhabditis elegans. Nature
1976;260:523-525.
5. Arantes-Oliveira N, Berman JR, Kenyon C: Healthy animals with extreme longevity. Science
2003;302:611.
6. Larsen PL, Clarke CF: Extension of life-span in Caenorhabditis elegans by a diet lacking coenzyme
Q. Science 2002;295:120-123.
7. Lumme J, Oikarinen A, Lakovaara S, Alatalo R: The environmental regulation of adult diapause in
Drosophila littoralis. J Insect Physiol 1974;20:2023-2033.
8. Tatar M, Yin C: Slow aging during insect reproductive diapause: why butterflies, grasshoppers and
flies are like worms. Exp Gerontol 2001;36:723-738.
9. Uvarov B: Hibernation of active stages of Acridoidea in temperate climates. Atti Acad Gioenia Sci
Nat 1966;6:175-189.
10. Weindruch R, Walford RL, Fligiel S, Guthrie D: The retardation of aging in mice by dietary
restriction: longevity, cancer, immunity and lifetime energy intake. J Nutr 1986;116:641-654.
11. Kealy RD, Lawler DF, Ballam JM, Mantz SL, Biery DN, Greeley EH, Lust G, Segre M, Smith GK,
Stowe HD: Effects of diet restriction on life span and age-related changes in dogs. J Am Vet Med
Assoc 2002;220:1315-1320.
12. Pinney DO, Stephens DF, Pope LS: Lifetime effects of winter supplemental feed level and age at first
parturition on range beef cows. J Anim Sci 1972;34:1067-1077.
13. Weismann A: Essays upon heredity and kindred biological problems, 2nd ed., vol. 1. Oxford,
Clarendon Press, 1891.
14. Medawar PB: An unsolved problem in biology. London, H.K. Lewis, 1952.
15. Williams GC: Pleiotropy, natural selection and the evolution of senescence. Evolution 1957;11:398-411.
16. Edney EB, Gill RW: Evolution of senescence and specific longevity. Nature 1967;220:281-282.
17. Austad SN: Retarded senescence in an insular population of Virginia opossums (Didelphis
virginiana). J Zool 1993;229:695-708.
18. Kirkwood TBL: Evolution of ageing. Nature 1977;270:301-304.
19. Karahalil B, Hogue BA, de Souza-Pinto NC, Bohr VA: Base excision repair capacity in mitochondria
and nuclei: tissue-specific variations. FASEB J 2002;16:1895-1902.
20. Motta MC, Divecha N, Lemieux M, Kamel C, Chen D, Gu W, Bultsma Y, McBurney M, Guarente
L: Mammalian SIRT1 represses forkhead transcription factors. Cell 2004;116:551-563.
21. Harrison DE, Archer JR: Natural selection for extended longevity from food restriction. Growth Dev
Aging 1988;52:65.
22. Holliday R: Food, reproduction and longevity: is the extended lifespan of calorie-restricted animals
an evolutionary adaptation? BioEssays 1989;10:125-127.
23. Bauer M, Hamm AC, Bonaus M, Jacob A, Jaekel J, Schorle H, Pankratz MJ, Katzenberger JD:
Starvation response in mouse liver shows strong correlation with lifespan prolonging processes.
Physiol Genomics 2004;17:230-244.
24. Westendorp RG: Leiden research program on ageing. Exp Gerontol 2002;37:609-614.
25. Westendorp RG, van Dunne FM, Kirkwood TB, Helmerhorst FM, Huizinga TW: Optimizing human
fertility and survival. Nat Med 2001;7:873.
26. Lio D, Scola L, Crivello A, Colonna-Romano G, Candore G, Bonafe M, Cavallone L, Marchegiani F,
Olivieri F, Franceschi C, Caruso C: Inflammation, genetics, and longevity: further studies on the
protective effects in men of IL-10 -1082 promoter SNP and its interaction with TNF-alpha -308
promoter SNP. J Med Genet 2003;40:296-299.
27. Rose MR: Laboratory evolution of postponed senescence in Drosophila melanogaster. Evolution
1984;38:1004-1010.
28. Crespi BJ, Teo R: Comparative phylogenetic analysis of the evolution of semelparity and life history
in salmonid fishes. Evolution Int J Org Evolution 2002;56:1008-1020.
29. Holliday R: Understanding Ageing. Cambridge, Cambridge University Press, 1995.
30. Bodkin NL, Alexander TM, Ortmeyer HK, Johnson E, Hansen BC: Mortality and morbidity in
laboratory-maintained Rhesus monkeys and effects of long-term dietary restriction. J Gerontol A Biol
Sci Med Sci 2003;58:212-219.
31. Masoro EJ: Subfield history: caloric restriction, slowing aging, and extending life. Sci Aging
Knowledge Environ 2003;2003(8):RE2.
32. Lin YJ, Seroude L, Benzer S: Extended life-span and stress resistance in the Drosophila mutant
methuselah. Science 1998;282:943-946.
33. Rogina B, Reenan RA, Nilsen SP, Helfand SL: Extended life-span conferred by cotransporter gene
mutations in Drosophila. Science 2000;290:2137-2140.
34. Clancy DJ, Gems D, Harshman LG, Oldham S, Stocker H, Hafen E, Leevers SJ, Partridge L:
Extension of life-span by loss of CHICO, a Drosophila insulin receptor substrate protein. Science
2001;292:104-106.
35. Tatar M, Kopelman A, Epstein D, Tu MP, Yin CM, Garofalo RS: A mutant Drosophila insulin
receptor homolog that extends life-span and impairs neuroendocrine function. Science 2001;292:107-110.
36. Herman WS, Tatar M: Juvenile hormone regulation of longevity in the migratory monarch butterfly.
Proc R Soc Lond B Biol Sci 2001;268:2509-2514.
37. Bartke A, Coschigano K, Kopchick J, Chandrashekar V, Mattison J, Kinney B, Hauck S: Genes that
prolong life: relationships of growth hormone and growth to aging and life span. J Gerontol A Biol
Sci Med Sci 2001;56:B340-B349.
38. Hsin H, Kenyon C: Signals from the reproductive system regulate the lifespan of C. elegans. Nature
1999;399:362-366.
39. Fontana L, Meyer TE, Klein S, Holloszy JO: Long-term calorie restriction is highly effective in
reducing the risk for atherosclerosis in humans. Proc Natl Acad Sci USA 2004;101:6659-6663.
40. Yashin AI, Begun AS, Boiko SI, Ukraintseva SV, Oeppen J: The new trends in survival improvement
require a revision of traditional gerontological concepts. Exp Gerontol 2001;37:157-167.
41. Stuart M: The Genomics of Longevity. Windhover's Review of Emerging Medical Ventures
2003;8:22.
42. Guarente L: Ageless Quest. New York, Cold Spring Harbor Laboratory Press, 2003.