Sunday, February 23, 2014

Relaxed selection can lead to surprising outcomes

Our understanding and expectations of how natural selection should lead to adaptive evolution in the wild are heavily informed by predictions derived from theoretical and laboratory studies. These, although useful for expanding our understanding of evolutionary processes, tend to consider only a single source of selection, and thus do not capture the complexity of natural environments, where multiple sources of selection act simultaneously. Field surveys, on the other hand, do integrate the complexity of nature, but are limited in their ability to disentangle cause and effect. As a result, field studies often do match our predictions – but sometimes our predictions totally miss the mark, and it is difficult to figure out why. One way of bridging these two approaches is to experimentally manipulate an environmental factor of interest in nature, allowing cause and effect to be connected even in a complex system.

Evidence that drastic changes in an organism’s environment can lead to rapid adaptive evolution has proliferated in the past 20 years. These drastic changes come in many forms and flavours, but they generally involve the emergence – or, less frequently, the loss – of a selective factor. Several studies in the wild report that introducing new predators, contaminants, competitors, or parasites can lead to an increase in the ability of the affected populations to deal with the new stressors; the population adapts to the new source of selection. We commonly assume that the reverse should also be true: the loss of a source of selection (i.e. relaxed selection) should lead to the loss of the ability to deal with that stressor. However, there is good evidence that traits might not change for relatively long periods after the loss of a source of selection (see Lahti et al. 2009), perhaps because, if the cost of the adaptation to the stressor was very low, change would be driven only by drift.

Through a series of field introductions, in collaboration with the FIBR Guppy Project, we tested whether removing Gyrodactylus spp., a common and deleterious ectoparasite of wild guppies, would lead to the evolution of decreased resistance in their guppy hosts. As it turned out, this experiment was one of those studies where predictions totally missed the mark, and because of that it broadened our understanding of the evolution of resistance to parasites in the wild. In a nutshell, theory suggests that individuals that invest more in defence against parasites do so at the cost of investing in other fitness-enhancing traits. When parasite-induced mortality increases in a population, those individuals that are better able to resist infection will have a higher genetic representation in future generations. On the flip side, when the negative effects of parasitism decrease, those individuals that, for example, invest more in reproduction than in defence will have a competitive advantage over resistant individuals. A number of elegant laboratory experiments support these predictions, but they mostly focus on “simple” organisms (typically bacteria and protozoans, although there are some important arthropod studies).
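To make that verbal argument concrete, here is a minimal toy simulation (entirely hypothetical; it is not from the paper, and every parameter value is an arbitrary assumption) of a population containing a costly resistant type and a cheap susceptible type, tracked across generations with parasites either present or removed.

```python
# Toy two-type selection model: hypothetical illustration only.
# "Resistant" hosts pay a fecundity cost but survive parasites better;
# "susceptible" hosts reproduce more but die more when parasites are present.

def resistant_frequency(parasite_mortality, generations=50, p0=0.5,
                        fecundity_cost=0.1, protection=0.8):
    """Return the frequency of the resistant type after selection.

    parasite_mortality: chance a fully susceptible host is killed by parasites.
    protection: fraction of that mortality the resistant type avoids.
    fecundity_cost: reduction in offspring number paid for being resistant.
    (All values are arbitrary, chosen only to illustrate the argument.)
    """
    p = p0
    for _ in range(generations):
        w_resistant = (1.0 - fecundity_cost) * (1.0 - (1.0 - protection) * parasite_mortality)
        w_susceptible = 1.0 - parasite_mortality
        mean_w = p * w_resistant + (1 - p) * w_susceptible
        p = p * w_resistant / mean_w   # standard discrete-generation selection update
    return p

# With parasites causing heavy mortality, resistance spreads;
# with parasites removed, its fecundity cost drags it back down.
print(resistant_frequency(parasite_mortality=0.4))  # -> close to 1
print(resistant_frequency(parasite_mortality=0.0))  # -> close to 0
```

With heavy parasite-induced mortality the resistant type spreads; remove the parasites and its fecundity cost drags it back down. That is the textbook prediction that our field experiment ended up contradicting.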

The rain forest of Trinidad: home to guppies and their parasites
(Photo: A. Hendry)
Given these clear predictions, we set out to test whether, under the complexities of a natural environment, removal of parasites would lead to the evolution of decreased defence against parasites in a vertebrate host with a complex immune system. What we found was surprising. At both four and eight generations after being released from parasitism, our female guppies had repeatedly increased – yes, increased! – their resistance to the parasite relative to the ancestral population (read the paper here).

This increase in resistance could have been due to various methodological artefacts, and we also considered several biological hypotheses for the outcome. We ruled out methodological artefacts such as differences among populations in mortality (it was not the case that less-resistant individuals died earlier, albeit with fewer parasites, than more-resistant individuals) or size (it was not the case that the ancestral individuals were larger and thus provided more resources for a larger parasite load, which would have made them appear less resistant). Among the biological explanations we discarded was the possibility that we were actually seeing the effects of selection for tolerance, not for resistance. In the parasite literature, resistance refers to the ability of hosts to reduce the number of parasites they carry, while tolerance is the ability to reduce the damage caused by a given number of parasites. Resistance and tolerance are expected to trade off against each other: high resistance means that parasite numbers are low, so investing in reducing their negative effects would be a waste of resources; high tolerance means parasites cause little damage, so investing in reducing their number would also be a waste of resources. If parasite removal was causing selection for decreased tolerance, then resistance could be indirectly increasing as a response. But we found that tolerance did not change much – if anything, it also increased after parasite removal.
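As an aside for readers unfamiliar with how these two quantities are separated in practice, the sketch below uses invented numbers to show the usual logic: resistance is summarized by how low parasite loads stay, while tolerance is estimated from how gently host condition declines as load rises. The specific metrics here (inverse mean load, regression slope) are illustrative assumptions, not necessarily those used in the paper.

```python
# Hypothetical illustration of how resistance and tolerance are usually
# told apart: resistance ~ how low parasite loads stay, tolerance ~ how
# gently host condition declines as parasite load rises.
import numpy as np

# Made-up data: parasite counts and host condition for two populations.
load_a      = np.array([2, 4, 6, 8, 10], dtype=float)
condition_a = np.array([0.95, 0.90, 0.85, 0.80, 0.75])   # shallow decline

load_b      = np.array([5, 10, 15, 20, 25], dtype=float)
condition_b = np.array([0.90, 0.60, 0.30, 0.20, 0.10])   # steep decline

def describe(load, condition, label):
    resistance = 1.0 / load.mean()                  # lower burden -> higher resistance
    tolerance = np.polyfit(load, condition, 1)[0]   # slope; closer to 0 -> more tolerant
    print(f"{label}: resistance index = {resistance:.3f}, tolerance slope = {tolerance:.3f}")

describe(load_a, condition_a, "Population A")  # keeps loads low, loses condition slowly
describe(load_b, condition_b, "Population B")  # heavier loads, condition drops faster per parasite
```

The point is simply that the two can be scored independently, which is what allowed us to check whether the apparent gain in resistance was really a side effect of selection on tolerance.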

Experimental guppies, parasites and me in the lab (Photo: G. Capurro)
We present several lines of evidence suggesting that this increase in resistance after relaxation of parasite-mediated selection is also a common outcome in wild guppy populations. We think this has to do with one of the “interacting factors” present in natural systems: predation. As part of the experimental translocations, our guppy populations faced a strong shift not only in parasite pressure, but also in predation-induced mortality, since they were moved into sites where major predators were absent. Changes in predation have been shown to induce rapid evolution of life-history traits in guppies, and some life-history traits, such as life expectancy, are known to correlate with resistance to parasites. In our case, it seems that release from predation, and the evolved increase in life expectancy that it brings, could be pleiotropically producing increased resistance (an idea known as the pace-of-life hypothesis). In the wild, parasites generally don’t just disappear, so this pleiotropic connection between lifespan and resistance would be adaptive: when predators are present, an early death is likely and resistance to parasites is an unnecessary luxury, so individuals might put all their resources into reproducing as early as they can; but when early death due to predation is unlikely, low parasite resistance might greatly reduce an individual’s fitness.

Like all good research, this study has led us to new questions. For example, how often does the pace-of-life hypothesis apply in other systems, and under what circumstances? And more generally, is predation always a stronger selective force than parasitism, even for traits such as resistance that are directly related to parasites?

The paper:

    Dargent F., Scott M.E., Hendry A.P., Fussmann G.F. 2013. Experimental elimination of parasites in nature leads to the evolution of increased resistance in hosts. Proc R Soc Lond B Biol Sci 280(1773). (doi:10.1098/rspb.2013.2371).
