Theoretical ecology is the scientific discipline devoted to the study of ecological systems using theoretical methods such as simple conceptual models, mathematical models, computational simulations, and advanced data analysis. Effective models improve understanding of the natural world by revealing how the dynamics of species populations are often based on fundamental biological conditions and processes. Further, the field aims to unify a diverse range of empirical observations by assuming that common, mechanistic processes generate observable phenomena across species and ecological environments. Based on biologically realistic assumptions, theoretical ecologists are able to uncover novel, non-intuitive insights about natural processes. Theoretical results are often verified by empirical and observational studies, revealing the power of theoretical methods in both predicting and understanding the noisy, diverse biological world.
The field is broad and includes foundations in applied mathematics, computer science, biology, statistical physics, genetics, chemistry, evolution, and conservation biology. Theoretical ecology aims to explain a diverse range of phenomena in the life sciences, such as population growth and dynamics, fisheries, competition, evolutionary theory, epidemiology, animal behavior and group dynamics, food webs, ecosystems, spatial ecology, and the effects of climate change.
Theoretical ecology has further benefited from the advent of fast computing power, allowing the analysis and visualization of large-scale computational simulations of ecological phenomena. Importantly, these modern tools provide quantitative predictions about the effects of human induced environmental change on a diverse variety of ecological phenomena, such as: species invasions, climate change, the effect of fishing and hunting on food network stability, and the global carbon cycle.
As in most other sciences, mathematical models form the foundation of modern ecological theory.
Models are often used to describe real ecological reproduction processes of single or multiple species. These can be modelled using stochastic branching processes. Examples are the dynamics of interacting populations (predation, competition, and mutualism), which, depending on the species of interest, may best be modeled over either continuous or discrete time. Other examples of such models may be found in the field of mathematical epidemiology, where the dynamic relationships to be modeled are host–pathogen interactions.
Bifurcation theory is used to illustrate how small changes in parameter values can give rise to dramatically different long-run outcomes, a mathematical fact that may be used to explain drastic ecological differences that come about in qualitatively very similar systems. Logistic maps are polynomial mappings, and are often cited as providing archetypal examples of how chaotic behaviour can arise from very simple non-linear dynamical equations. The maps were popularized in a seminal 1976 paper by the theoretical ecologist Robert May. The difference equation is intended to capture the two effects of reproduction and starvation.
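The hallmark of this chaos, sensitive dependence on initial conditions, can be illustrated with a short sketch of the logistic map x → rx(1 − x); the parameter value r = 3.9 and the starting points are illustrative choices in the chaotic regime:

```python
def logistic_map(x, r):
    """One iteration of the logistic map x -> r*x*(1 - x)."""
    return r * x * (1 - x)

# Two trajectories starting a billionth apart, iterated at r = 3.9
# (an illustrative value in the chaotic regime), diverge to visibly
# different population values.
x, y = 0.2, 0.2 + 1e-9
divergence = []
for _ in range(100):
    x, y = logistic_map(x, 3.9), logistic_map(y, 3.9)
    divergence.append(abs(x - y))

print(max(divergence) > 0.01)  # the tiny initial gap has blown up
```

For r between 0 and 4 the iterates stay in the unit interval, so the map can be read as a scaled population fraction.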
In 1930, R.A. Fisher published his classic The Genetical Theory of Natural Selection, which introduced the idea that frequency-dependent fitness brings a strategic aspect to evolution, where the payoffs to a particular organism, arising from the interplay of all of the relevant organisms, are the number of this organism's viable offspring. In 1961, Richard Lewontin applied game theory to evolutionary biology in his Evolution and the Theory of Games, followed closely by John Maynard Smith, who in his seminal 1972 paper, "Game Theory and the Evolution of Fighting", defined the concept of the evolutionarily stable strategy.
Because ecological systems are typically nonlinear, they often cannot be solved analytically, and in order to obtain sensible results, nonlinear, stochastic and computational techniques must be used. One class of computational models that is becoming increasingly popular is the agent-based model. These models can simulate the actions and interactions of multiple, heterogeneous organisms where more traditional, analytical techniques are inadequate. Applied theoretical ecology yields results which are used in the real world. For example, optimal harvesting theory draws on optimization techniques developed in economics, computer science and operations research, and is widely used in fisheries.
Population ecology is a sub-field of ecology that deals with the dynamics of species populations and how these populations interact with the environment. It is the study of how the population sizes of species living together in groups change over time and space, and was one of the first aspects of ecology to be studied and modelled mathematically.
The most basic way of modeling population dynamics is to assume that the rate of growth of a population depends only upon the population size at that time and the per capita growth rate of the organism. In other words, if the number of individuals in a population at a time t is N(t), then the rate of population growth is given by:
dN(t)/dt = rN(t),
where r is the per capita growth rate, or the intrinsic growth rate of the organism. It can also be described as r = b - d, where b and d are the per capita time-invariant birth and death rates, respectively. This first-order linear differential equation can be solved to yield the solution
N(t) = N(0)e^(rt),
a trajectory known as Malthusian growth, after Thomas Malthus, who first described its dynamics in 1798. A population experiencing Malthusian growth follows an exponential curve, where N(0) is the initial population size. The population grows when r > 0, and declines when r < 0. The model is most applicable in cases where a few organisms have begun a colony and are rapidly growing without any limitations or restrictions impeding their growth (e.g. bacteria inoculated in rich media).
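The closed-form solution can be evaluated directly; the population sizes and rates below are purely illustrative (a hypothetical bacterial culture):

```python
import math

def malthusian(n0, r, t):
    """Malthusian growth N(t) = N(0)*exp(r*t), the solution of dN/dt = rN."""
    return n0 * math.exp(r * t)

# A hypothetical culture of 100 cells with per capita growth rate
# r = 0.5 per hour grows; with r = -0.5 per hour it declines.
print(malthusian(100, 0.5, 10) > 100)   # r > 0: growth
print(malthusian(100, -0.5, 10) < 100)  # r < 0: decline
```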
The exponential growth model makes a number of assumptions, many of which often do not hold. For example, the intrinsic growth rate depends on many factors and is often not time-invariant. A simple modification of the exponential growth model is to assume that the intrinsic growth rate varies with population size. This is reasonable: the larger the population size, the fewer resources available, which can result in a lower birth rate and higher death rate. Hence, we can replace the time-invariant r with r'(t) = (b - a*N(t)) - (d + c*N(t)), where a and c are constants that modulate birth and death rates in a population-dependent manner (e.g. intraspecific competition). Both a and c will depend on other environmental factors which, for now, we can assume to be constant in this approximated model. The differential equation is now:
dN(t)/dt = ((b - a*N(t)) - (d + c*N(t)))*N(t)
This can be rewritten as:
dN(t)/dt = rN(t)(1 - N(t)/K),
where r = b-d and K = (b-d)/(a+c).
The biological significance of K becomes apparent when stabilities of the equilibria of the system are considered. The constant K is the carrying capacity of the population. The equilibria of the system are N = 0 and N = K. If the system is linearized, it can be seen that N = 0 is an unstable equilibrium while K is a stable equilibrium.
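Both properties can be checked with a quick numerical sketch (simple Euler integration; r = 0.8 and K = 1000 are invented values):

```python
def logistic_step(n, r, k, dt=0.01):
    """One Euler step of the logistic equation dN/dt = rN(1 - N/K)."""
    return n + r * n * (1 - n / k) * dt

# Illustrative parameters: r = 0.8, K = 1000.
n = 10.0
for _ in range(5000):          # integrate to t = 50
    n = logistic_step(n, 0.8, 1000)
print(abs(n - 1000) < 1)       # the population has settled near K

# N = 0 is unstable: a tiny population grows away from it.
print(logistic_step(1e-6, 0.8, 1000) > 1e-6)
```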
Another assumption of the exponential growth model is that all individuals within a population are identical and have the same probabilities of surviving and of reproducing. This is not a valid assumption for species with complex life histories. The exponential growth model can be modified to account for this, by tracking the number of individuals in different age classes (e.g. one-, two-, and three-year-olds) or different stage classes (juveniles, sub-adults, and adults) separately, and allowing individuals in each group to have their own survival and reproduction rates. The general form of this model is
Nt+1 = LNt,
where Nt is a vector of the number of individuals in each class at time t and L is a matrix that contains the survival probability and fecundity for each class. The matrix L is referred to as the Leslie matrix for age-structured models, and as the Lefkovitch matrix for stage-structured models.
If parameter values in L are estimated from demographic data on a specific population, a structured model can then be used to predict whether this population is expected to grow or decline in the long-term, and what the expected age distribution within the population will be. This has been done for a number of species including loggerhead sea turtles and right whales.
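A small sketch with a hypothetical three-age-class Leslie matrix (every number below is invented for illustration) shows how the long-term growth rate emerges from repeated projection; it is the dominant eigenvalue of L, approximated here by the ratio of successive total population sizes:

```python
def project(L, n):
    """One projection step of a structured model: n(t+1) = L n(t)."""
    return [sum(L[i][j] * n[j] for j in range(len(n))) for i in range(len(L))]

# Hypothetical Leslie matrix: the first row holds per-class fecundities,
# the sub-diagonal holds survival probabilities into the next age class.
L = [[0.0, 1.5, 1.0],
     [0.5, 0.0, 0.0],
     [0.0, 0.25, 0.0]]

n = [100.0, 0.0, 0.0]
for _ in range(50):
    n = project(L, n)

# Ratio of successive totals approximates the dominant eigenvalue of L.
growth = sum(project(L, n)) / sum(n)
print(0 < growth < 1)  # this hypothetical population declines long-term
```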
An ecological community is a group of trophically similar, sympatric species that actually or potentially compete in a local area for the same or similar resources. Interactions between these species form the first steps in analyzing more complex dynamics of ecosystems. These interactions shape the distribution and dynamics of species. Of these interactions, predation is one of the most widespread population activities. Taken in its most general sense, predation comprises predator–prey, host–pathogen, and host–parasitoid interactions.
Predator–prey interactions exhibit natural oscillations in the populations of both predator and prey. In 1925, the American mathematician Alfred J. Lotka developed simple equations for predator–prey interactions in his book on biomathematics. The following year, the Italian mathematician Vito Volterra made a statistical analysis of fish catches in the Adriatic and independently developed the same equations. It is one of the earliest and most recognised ecological models, known as the Lotka-Volterra model:
dN/dt = rN - αNP
dP/dt = cαNP - dP,
where N and P are the prey and predator population sizes, r is the rate of prey growth, taken to be exponential in the absence of any predators, α is the per-capita rate of predation on prey (also called the ‘attack rate’), c is the efficiency of conversion from prey to predator, and d is the exponential death rate for predators in the absence of any prey.
Volterra originally used the model to explain fluctuations in fish and shark populations after fishing was curtailed during the First World War. However, the equations have subsequently been applied more generally. Other examples of these models include the Lotka-Volterra model of the snowshoe hare and Canadian lynx in North America, infectious disease modeling such as the 2003 outbreak of SARS, and the biological control of California red scale by the introduction of its parasitoid, Aphytis melinus.
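A minimal numerical sketch (Euler integration; all parameter values invented) reproduces the characteristic cycles: the prey population rises and falls around its starting value instead of settling to a fixed point.

```python
def lv_step(N, P, r, a, c, d, dt=0.001):
    """One Euler step of the Lotka-Volterra equations
    dN/dt = rN - aNP and dP/dt = caNP - dP."""
    dN = (r * N - a * N * P) * dt
    dP = (c * a * N * P - d * P) * dt
    return N + dN, P + dP

# Illustrative parameters; the equilibrium is N* = d/(ca) = 10, P* = r/a = 10.
N, P = 10.0, 5.0
prey = []
for _ in range(100000):        # integrate to t = 100
    N, P = lv_step(N, P, r=1.0, a=0.1, c=0.5, d=0.5)
    prey.append(N)

print(min(prey) < 10.0 < max(prey))  # prey oscillates around its start
```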
A credible, simple alternative to the Lotka-Volterra predator–prey model and its common prey-dependent generalizations is the ratio-dependent or Arditi-Ginzburg model. The two are the extremes of the spectrum of predator interference models. According to the authors of the alternative view, the data show that true interactions in nature are so far from the Lotka–Volterra extreme on the interference spectrum that the model can simply be discounted as wrong. They are much closer to the ratio-dependent extreme, so if a simple model is needed one can use the Arditi–Ginzburg model as the first approximation.
The second interaction, that of host and pathogen, differs from predator–prey interactions in that pathogens are much smaller, have much faster generation times, and require a host to reproduce. Therefore, only the host population is tracked in host–pathogen models. Compartmental models that categorize host population into groups such as susceptible, infected, and recovered (SIR) are commonly used.
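The standard SIR equations can be integrated numerically as a sketch; the transmission rate β = 0.5 and recovery rate γ = 0.1 below are invented values giving a basic reproduction number β/γ = 5, so an epidemic occurs:

```python
def sir_step(S, I, R, beta, gamma, dt=0.01):
    """One Euler step of the SIR model for host fractions:
    dS/dt = -beta*S*I, dI/dt = beta*S*I - gamma*I, dR/dt = gamma*I."""
    new_inf = beta * S * I * dt
    rec = gamma * I * dt
    return S - new_inf, I + new_inf - rec, R + rec

S, I, R = 0.99, 0.01, 0.0
for _ in range(20000):              # integrate to t = 200
    S, I, R = sir_step(S, I, R, beta=0.5, gamma=0.1)

print(abs(S + I + R - 1.0) < 1e-9)  # hosts are conserved across compartments
print(S < 0.5 and I < 0.01)         # the epidemic swept through and died out
```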
The third interaction, that of host and parasitoid, can be analyzed by the Nicholson-Bailey model, which differs from Lotka-Volterra and SIR models in that it is discrete in time. This model, like that of Lotka-Volterra, tracks both populations explicitly. Typically, in its general form, it states:
Nt+1 = λNt[1 - f(Nt, Pt)]
Pt+1 = cNt f(Nt, Pt),
where f(Nt, Pt) describes the probability of infection (typically, Poisson distribution), λ is the per-capita growth rate of hosts in the absence of parasitoids, and c is the conversion efficiency, as in the Lotka-Volterra model.
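A sketch of one common concrete choice, taking f as the Poisson-derived probability of parasitism f(Nt, Pt) = 1 − exp(−a·Pt); the parameter values are invented, and the classic model is known to produce oscillations of growing amplitude rather than a stable equilibrium:

```python
import math

def nicholson_bailey(N, P, lam, a, c):
    """One generation of the Nicholson-Bailey host-parasitoid model.
    f = 1 - exp(-a*P) is the probability a host is parasitized."""
    f = 1 - math.exp(-a * P)
    return lam * N * (1 - f), c * N * f

# Illustrative parameters.
N, P = 25.0, 10.0
hosts = []
for _ in range(20):
    N, P = nicholson_bailey(N, P, lam=2.0, a=0.05, c=1.0)
    hosts.append(N)

print(all(h > 0 for h in hosts))  # host numbers stay positive while cycling
```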
In studies of the populations of two species, the Lotka-Volterra system of equations has been extensively used to describe dynamics of behavior between two species, N1 and N2. In its general two-species form the system reads:
dN1/dt = (r1N1/K1)(K1 - N1 + α12N2)
dN2/dt = (r2N2/K2)(K2 - N2 + α21N1)
Examples include relations between D. discoiderum and E. coli, as well as theoretical analysis of the behavior of the system.
The r coefficients give a “base” growth rate to each species, while K coefficients correspond to the carrying capacity. What can really change the dynamics of a system, however, are the α terms. These describe the nature of the relationship between the two species. When α12 is negative, it means that N2 has a negative effect on N1, by competing with it, preying on it, or any number of other possibilities. When α12 is positive, however, it means that N2 has a positive effect on N1, through some kind of mutualistic interaction between the two. When both α12 and α21 are negative, the relationship is described as competitive. In this case, each species detracts from the other, potentially over competition for scarce resources. When both α12 and α21 are positive, the relationship becomes one of mutualism. In this case, each species provides a benefit to the other, such that the presence of one aids the population growth of the other.
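These sign conventions can be checked numerically. In the sketch below (Euler integration; all parameter values invented), a symmetric mutualism with α12 = α21 = +0.2 lets each species equilibrate above its own carrying capacity:

```python
def two_species_step(n1, n2, r1, r2, k1, k2, a12, a21, dt=0.01):
    """One Euler step of dN1/dt = (r1*N1/K1)(K1 - N1 + a12*N2)
    and the symmetric equation for N2."""
    d1 = (r1 * n1 / k1) * (k1 - n1 + a12 * n2) * dt
    d2 = (r2 * n2 / k2) * (k2 - n2 + a21 * n1) * dt
    return n1 + d1, n2 + d2

# Mutualism: a12 = a21 = +0.2.  Solving the equilibrium conditions
# N1 = K1 + 0.2*N2 and N2 = K2 + 0.2*N1 with K1 = K2 = 100 gives 125.
n1, n2 = 50.0, 50.0
for _ in range(10000):           # integrate to t = 100
    n1, n2 = two_species_step(n1, n2, 0.5, 0.5, 100, 100, 0.2, 0.2)

print(n1 > 100 and n2 > 100)     # both sit above their carrying capacities
```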
Unified neutral theory is a hypothesis proposed by Stephen Hubbell in 2001. The hypothesis aims to explain the diversity and relative abundance of species in ecological communities. Like other neutral theories in ecology, Hubbell's hypothesis assumes that the differences between members of an ecological community of trophically similar species are "neutral," or irrelevant to their success. Neutrality means that at a given trophic level in a food web, species are equivalent in birth rates, death rates, dispersal rates and speciation rates, when measured on a per-capita basis. This implies that biodiversity arises at random, as each species follows a random walk. This can be considered a null hypothesis to niche theory. The hypothesis has sparked controversy, and some authors consider it a more complex version of other null models that fit the data better.
Under unified neutral theory, complex ecological interactions are permitted among individuals of an ecological community (such as competition and cooperation), provided all individuals obey the same rules. Asymmetric phenomena such as parasitism and predation are ruled out by the terms of reference; but cooperative strategies such as swarming, and negative interactions such as competing for limited food or light, are allowed, so long as all individuals behave the same way. The theory makes predictions that have implications for the management of biodiversity, especially the management of rare species. It predicts the existence of a fundamental biodiversity constant, conventionally written θ, that appears to govern species richness on a wide variety of spatial and temporal scales.
Biogeography is the study of the distribution of species in space and time. It aims to reveal where organisms live, at what abundance, and why they are (or are not) found in a certain geographical area.
Biogeography is most keenly observed on islands, which has led to the development of the subdiscipline of island biogeography. These habitats are often more manageable areas of study because they are more condensed than larger ecosystems on the mainland. In 1967, Robert MacArthur and E.O. Wilson published The Theory of Island Biogeography. This showed that the species richness in an area could be predicted in terms of factors such as habitat area, immigration rate and extinction rate. The theory is considered one of the fundamentals of ecological theory. The application of island biogeography theory to habitat fragments spurred the development of the fields of conservation biology and landscape ecology.
A population ecology concept is r/K selection theory, one of the first predictive models in ecology used to explain life-history evolution. The premise behind the r/K selection model is that natural selection pressures change according to population density. For example, when an island is first colonized, density of individuals is low. The initial increase in population size is not limited by competition, leaving an abundance of available resources for rapid population growth. These early phases of population growth experience density-independent forces of natural selection, which is called r-selection. As the population becomes more crowded, it approaches the island's carrying capacity, thus forcing individuals to compete more heavily for fewer available resources. Under crowded conditions, the population experiences density-dependent forces of natural selection, called K-selection.
Spatial analysis of ecological systems often reveals that assumptions that are valid for spatially homogenous populations – and indeed, intuitive – may no longer be valid when migratory subpopulations moving from one patch to another are considered. In a simple one-species formulation, a subpopulation may occupy a patch, move from one patch to another empty patch, or die out leaving an empty patch behind. In such a case, the proportion of occupied patches p may be represented as
dp/dt = mp(1 - p) - ep,
where m is the rate of colonization, and e is the rate of extinction. In this model, if e < m, the steady state value of p is 1 – (e/m) while in the other case, all the patches will eventually be left empty. This model may be made more complex by addition of another species in several different ways, including but not limited to game theoretic approaches, predator–prey interactions, etc. We will consider here an extension of the previous one-species system for simplicity. Let us denote the proportion of patches occupied by the first population as p1, and that by the second as p2. Then,
In this case, if e is too high, p1 and p2 will be zero at steady state. However, when the rate of extinction is moderate, p1 and p2 can stably coexist. The steady state value of p2 is given by
(p*1 may be inferred by symmetry). If e is zero, the dynamics of the system favor the species that is better at colonizing (i.e. has the higher m value). This leads to a very important result in theoretical ecology known as the Intermediate Disturbance Hypothesis, where the biodiversity (the number of species that coexist in the population) is maximized when the disturbance (of which e is a proxy here) is not too high or too low, but at intermediate levels.
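The single-species steady state 1 − e/m can be verified with a quick numerical sketch (Euler integration; the colonization and extinction rates below are invented):

```python
def levins_step(p, m, e, dt=0.01):
    """One Euler step of the Levins metapopulation model
    dp/dt = m*p*(1 - p) - e*p."""
    return p + (m * p * (1 - p) - e * p) * dt

# Colonization m = 0.5, extinction e = 0.1 (illustrative values):
# occupancy should settle near 1 - e/m = 0.8.
p = 0.05
for _ in range(10000):           # integrate to t = 100
    p = levins_step(p, m=0.5, e=0.1)
print(abs(p - 0.8) < 0.01)

# If extinction outpaces colonization (e > m), occupancy collapses.
q = 0.5
for _ in range(10000):
    q = levins_step(q, m=0.1, e=0.5)
print(q < 0.01)
```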
The form of the differential equations used in this simplistic modelling approach can be modified. For example:
The model can also be extended to combinations of the four possible linear or non-linear dependencies of colonization and extinction on p; these are described in more detail in the literature.
Introducing new elements, whether biotic or abiotic, into ecosystems can be disruptive. In some cases, it leads to ecological collapse, trophic cascades and the death of many species within the ecosystem. The abstract notion of ecological health attempts to measure the robustness and recovery capacity for an ecosystem; i.e. how far the ecosystem is away from its steady state. Often, however, ecosystems rebound from a disruptive agent. The difference between collapse or rebound depends on the toxicity of the introduced element and the resiliency of the original ecosystem.
If ecosystems are governed primarily by stochastic processes, through which their subsequent states are determined by both predictable and random actions, they may be more resilient to sudden change than each species individually. In the absence of a balance of nature, the species composition of ecosystems would undergo shifts that would depend on the nature of the change, but entire ecological collapse would probably be an infrequent event. In 1997, Robert Ulanowicz used information theory tools to describe the structure of ecosystems, emphasizing mutual information (correlations) in studied systems. Drawing on this methodology and prior observations of complex ecosystems, Ulanowicz depicts approaches to determining the stress levels on ecosystems and predicting system reactions to defined types of alteration in their settings (such as increased or reduced energy flow, and eutrophication).
Ecopath is a free ecosystem modelling software suite, initially developed by NOAA, and widely used in fisheries management as a tool for modelling and visualising the complex relationships that exist in real world marine ecosystems.
Food webs provide a framework within which a complex network of predator–prey interactions can be organised. A food web model is a network of food chains. Each food chain starts with a primary producer or autotroph, an organism, such as a plant, which is able to manufacture its own food. Next in the chain is an organism that feeds on the primary producer, and the chain continues in this way as a string of successive predators. The organisms in each chain are grouped into trophic levels, based on how many links they are removed from the primary producers. The length of the chain, or trophic level, is a measure of the number of species encountered as energy or nutrients move from plants to top predators. Food energy flows from one organism to the next and to the next and so on, with some energy being lost at each level. At a given trophic level there may be one species or a group of species with the same predators and prey.
In 1927, Charles Elton published an influential synthesis on the use of food webs, which resulted in them becoming a central concept in ecology. In 1966, interest in food webs increased after Robert Paine's experimental and descriptive study of intertidal shores, suggesting that food web complexity was key to maintaining species diversity and ecological stability. Many theoretical ecologists, including Sir Robert May and Stuart Pimm, were prompted by this discovery and others to examine the mathematical properties of food webs. According to their analyses, complex food webs should be less stable than simple food webs. The apparent paradox between the complexity of food webs observed in nature and the mathematical fragility of food web models is currently an area of intensive study and debate. The paradox may be due partially to conceptual differences between persistence of a food web and equilibrial stability of a food web.
Systems ecology can be seen as an application of general systems theory to ecology. It takes a holistic and interdisciplinary approach to the study of ecological systems, and particularly ecosystems. Systems ecology is especially concerned with the way the functioning of ecosystems can be influenced by human interventions. Like other fields in theoretical ecology, it uses and extends concepts from thermodynamics and develops other macroscopic descriptions of complex systems. It also takes account of the energy flows through the different trophic levels in the ecological networks. In systems ecology the principles of ecosystem energy flows are considered formally analogous to the principles of energetics. Systems ecology also considers the external influence of ecological economics, which usually is not otherwise considered in ecosystem ecology. For the most part, systems ecology is a subfield of ecosystem ecology.
This is the study of how "the environment, both physical and biological, interacts with the physiology of an organism. It includes the effects of climate and nutrients on physiological processes in both plants and animals, and has a particular focus on how physiological processes scale with organism size".
Swarm behaviour is a collective behaviour exhibited by animals of similar size which aggregate together, perhaps milling about the same spot or perhaps migrating in some direction. Swarm behaviour is commonly exhibited by insects, but it also occurs in the flocking of birds, the schooling of fish and the herd behaviour of quadrupeds. It is a complex emergent behaviour that occurs when individual agents follow simple behavioral rules.
Recently, a number of mathematical models have been developed which explain many aspects of the emergent behaviour. Swarm algorithms follow a Lagrangian approach or an Eulerian approach. The Eulerian approach views the swarm as a field, working with the density of the swarm and deriving mean field properties. It is a hydrodynamic approach, and can be useful for modelling the overall dynamics of large swarms. However, most models work with the Lagrangian approach, which is an agent-based model following the individual agents (points or particles) that make up the swarm. Individual particle models can follow information on heading and spacing that is lost in the Eulerian approach. Examples include ant colony optimization, self-propelled particles and particle swarm optimization.
The British biologist Alfred Russel Wallace is best known for independently proposing a theory of evolution due to natural selection that prompted Charles Darwin to publish his own theory. In his famous 1858 paper, Wallace proposed natural selection as a kind of feedback mechanism which keeps species and varieties adapted to their environment.
The action of this principle is exactly like that of the centrifugal governor of the steam engine, which checks and corrects any irregularities almost before they become evident; and in like manner no unbalanced deficiency in the animal kingdom can ever reach any conspicuous magnitude, because it would make itself felt at the very first step, by rendering existence difficult and extinction almost sure soon to follow.
The cybernetician and anthropologist Gregory Bateson observed in the 1970s that, though writing it only as an example, Wallace had "probably said the most powerful thing that’d been said in the 19th Century". Subsequently, the connection between natural selection and systems theory has become an area of active research.
In contrast to previous ecological theories which considered floods to be catastrophic events, the river flood pulse concept argues that the annual flood pulse is the most important aspect and the most biologically productive feature of a river's ecosystem.
Theoretical ecology draws on pioneering work done by G. Evelyn Hutchinson and his students. Brothers H.T. Odum and E.P. Odum are generally recognised as the founders of modern theoretical ecology. Robert MacArthur brought theory to community ecology. Daniel Simberloff was the student of E.O. Wilson, with whom MacArthur collaborated on The Theory of Island Biogeography, a seminal work in the development of theoretical ecology.
Simberloff added statistical rigour to experimental ecology and was a key figure in the SLOSS debate, about whether it is preferable to protect a single large or several small reserves. This resulted in the supporters of Jared Diamond's community assembly rules defending their ideas through Neutral Model Analysis. Simberloff also played a key role in the (still ongoing) debate on the utility of corridors for connecting isolated reserves.
Stephen Hubbell and Michael Rosenzweig combined theoretical and practical elements into works that extended MacArthur and Wilson's Island Biogeography Theory - Hubbell with his Unified Neutral Theory of Biodiversity and Biogeography and Rosenzweig with his Species Diversity in Space and Time.
A tentative distinction can be made between mathematical ecologists, ecologists who apply mathematics to ecological problems, and mathematicians who develop the mathematics itself that arises out of ecological problems.
Some notable theoretical ecologists can be found in these categories:
Alan Hastings is a mathematical ecologist and distinguished professor in the Department of Environmental Science and Policy at the University of California, Davis. In 2005 he became a fellow of the American Academy of Arts and Sciences and in 2006 he won the Robert H. MacArthur Award. In 2008 he founded the journal Theoretical Ecology, in which he currently holds the position of editor in chief. Formerly, he was co-editor in chief of the Journal of Mathematical Biology. His research extends across many areas in theoretical ecology including spatial ecology, biological invasions, structured populations, and model fitting.

Bacterivore
Bacterivores are free-living, generally heterotrophic organisms, exclusively microscopic, which obtain energy and nutrients primarily or entirely from the consumption of bacteria. Many species of amoeba are bacterivores, as well as other types of protozoans. Commonly, all species of bacteria will be prey, but spores of some species, such as Clostridium perfringens, will never be prey, because of their cellular attributes.

Branching process
In probability theory, a branching process is a type of mathematical object known as a stochastic process, which consists of collections of random variables. The random variables of a stochastic process are indexed by the natural numbers. The original purpose of branching processes was to serve as a mathematical model of a population in which each individual in generation n produces some random number of individuals in generation n + 1, according, in the simplest case, to a fixed probability distribution that does not vary from individual to individual. Branching processes are used to model reproduction; for example, the individuals might correspond to bacteria, each of which generates 0, 1, or 2 offspring with some probability in a single time unit. Branching processes can also be used to model other systems with similar dynamics, e.g., the spread of surnames in genealogy or the propagation of neutrons in a nuclear reactor.
A central question in the theory of branching processes is the probability of ultimate extinction, where no individuals exist after some finite number of generations. Using Wald's equation, it can be shown that starting with one individual in generation zero, the expected size of generation n equals μn where μ is the expected number of children of each individual. If μ < 1, then the expected number of individuals goes rapidly to zero, which implies ultimate extinction with probability 1 by Markov's inequality. Alternatively, if μ > 1, then the probability of ultimate extinction is less than 1 (but not necessarily zero; consider a process where each individual either has 0 or 100 children with equal probability. In that case, μ = 50, but probability of ultimate extinction is greater than 0.5, since that's the probability that the first individual has 0 children). If μ = 1, then ultimate extinction occurs with probability 1 unless each individual always has exactly one child.
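The extinction probability is the smallest fixed point of the offspring probability generating function f, and can be found by iterating q ← f(q) starting from q = 0. A sketch using the bacteria example, with invented offspring probabilities 1/4, 1/4, 1/2 for 0, 1 and 2 children (so μ = 1.25):

```python
def extinction_probability(pgf, iterations=1000):
    """Smallest fixed point of the offspring probability generating
    function, obtained by iterating q -> f(q) starting from q = 0."""
    q = 0.0
    for _ in range(iterations):
        q = pgf(q)
    return q

# Bacteria producing 0, 1, or 2 offspring with probabilities 1/4, 1/4, 1/2
# (illustrative numbers): mu = 1.25 > 1, and solving q = 1/4 + q/4 + q^2/2
# gives extinction probability 1/2.
supercritical = lambda s: 0.25 + 0.25 * s + 0.5 * s * s
print(abs(extinction_probability(supercritical) - 0.5) < 1e-6)

# A subcritical process (mu = 0.5) goes extinct with probability 1.
subcritical = lambda s: 0.5 + 0.5 * s
print(abs(extinction_probability(subcritical) - 1.0) < 1e-6)
```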
In theoretical ecology, the parameter μ of a branching process is called the basic reproductive rate.

Climax community
In ecology, a climax community, or climatic climax community, is a historic term for a biological community of plants, animals, and fungi which, through the process of ecological succession in the development of vegetation in an area over time, has reached a steady state. This equilibrium was thought to occur because the climax community is composed of species best adapted to average conditions in that area. The term is sometimes also applied in soil development. Nevertheless, it has been found that a "steady state" is more apparent than real, particularly over long-enough periods of time. Notwithstanding, it remains a useful concept.
The idea of a single climax, which is defined in relation to regional climate, originated with Frederic Clements in the early 1900s. The first analysis of succession as leading to something like a climax was written by Henry Cowles in 1899, but it was Clements who used the term "climax" to describe the idealized endpoint of succession.

Dominance (ecology)
Ecological dominance is the degree to which a taxon is more numerous than its competitors in an ecological community, or makes up more of the biomass.
Most ecological communities are defined by their dominant species.
In many examples of wet woodland in western Europe, the dominant tree is alder (Alnus glutinosa).
In temperate bogs, the dominant vegetation is usually species of Sphagnum moss.
Tidal swamps in the tropics are usually dominated by species of mangrove (Rhizophoraceae).
Some sea floor communities are dominated by brittle stars.
Exposed rocky shorelines are dominated by sessile organisms such as barnacles and limpets.

Ecological Complexity
Ecological Complexity is a quarterly peer-reviewed scientific journal covering the field of biocomplexity in the environment and theoretical ecology with special attention to papers that integrate natural and social processes at various spatio-temporal scales. The founding editor was Bai-Lian (Larry) Li (University of California at Riverside) and the current editor-in-chief is Sergei Petrovskii (University of Leicester).

Ecological stability
An ecosystem is said to possess ecological stability (or equilibrium) if it is capable of returning to its equilibrium state after a perturbation (a capacity known as resilience) or does not experience unexpected large changes in its characteristics across time. Although the terms community stability and ecological stability are sometimes used interchangeably, community stability refers only to the characteristics of communities. It is possible for an ecosystem or a community to be stable in some of their properties and unstable in others. For example, a vegetation community in response to a drought might conserve biomass but lose biodiversity. Stable ecological systems abound in nature, and the scientific literature has documented them extensively, mainly in grassland plant communities and microbial communities. Nevertheless, not every community or ecosystem in nature is stable. Also, noise plays an important role in biological systems and, in some scenarios, it can fully determine their temporal dynamics.
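Resilience in the simplest case can be illustrated with a logistic population that is knocked away from its carrying capacity and returns to it. The sketch below uses Euler integration of dN/dt = rN(1 − N/K); the function name and all parameter values are illustrative assumptions, not drawn from any particular study:

```python
def simulate_logistic(r=0.5, K=100.0, n0=100.0, dt=0.01, steps=4000,
                      perturb_step=1000, perturb_to=20.0):
    """Euler integration of logistic growth dN/dt = r N (1 - N/K), with
    a one-off perturbation partway through, illustrating resilience:
    the return to the equilibrium K after a disturbance."""
    n = n0
    trajectory = []
    for step in range(steps):
        if step == perturb_step:
            n = perturb_to  # sudden disturbance, e.g. a severe drought
        n += r * n * (1 - n / K) * dt
        trajectory.append(n)
    return trajectory

traj = simulate_logistic()
# The population sits at K, is displaced to 20 at the perturbation,
# and then recovers toward K over the rest of the run.
```

In the vocabulary discussed below, this system is resilient (it returns to equilibrium) but not constant (its state changes through time).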
The concept of ecological stability emerged in the first half of the 20th century. With the advancement of theoretical ecology in the 1970s, the usage of the term has expanded to a wide variety of scenarios. This overuse of the term has led to controversy over its definition and implementation. In 1997, Grimm and Wissel made an inventory of 167 definitions used in the literature and found 70 different stability concepts. One of the strategies that these two authors proposed to clarify the subject is to replace ecological stability with more specific terms, such as constancy, resilience and persistence. Any claim about stability is only meaningful if it specifies which kind of stability is meant; otherwise it carries little information. Following this strategy, an ecosystem which oscillates cyclically around a fixed point, such as the one delineated by the predator-prey equations, would be described as persistent and resilient, but not as constant. Some authors, however, see good reason for the abundance of definitions, because they reflect the extensive variety of real and mathematical systems.

Feeding frenzy
In ecology, a feeding frenzy occurs when predators are overwhelmed by the amount of prey available. For example, a large school of fish can cause nearby sharks, such as the lemon shark, to enter into a feeding frenzy. This can cause the sharks to go wild, biting anything within range, including each other. Another functional explanation for the feeding frenzy is competition amongst predators. The term is most often used when referring to sharks or piranhas. It has also been used as a term within journalism.

Jacqueline McGlade
Jacqueline Myriam McGlade (born May 30, 1955) is a British-born Canadian marine biologist and environmental informatics professor. Her research concerns the spatial and nonlinear dynamics of ecosystems, climate change and scenario development. She is currently Professor of Resilience and Sustainable Development at the University College London Institute for Global Prosperity and Faculty of Engineering, and Professor and Director of the Sekenani Research Centre of the Maasai Mara University, Kenya.
She was Executive Director of the European Environment Agency from 2003 to 2013, during which time she was on leave from her post as Professor of Environmental Informatics at University College London.
Between 2014 and 2017 she was Chief Scientist and Director of the Science Division of the United Nations Environment Programme based in Nairobi.

Janzen–Connell hypothesis
The Janzen–Connell hypothesis is a widely accepted explanation for the maintenance of tree species biodiversity in tropical rainforests. It was published independently in the early 1970s by Daniel Janzen and Joseph Connell. According to their hypothesis, host-specific herbivores, pathogens, or other natural enemies (often referred to as predators) make the areas near a parent tree (the seed producing tree) inhospitable for the survival of seedlings. These natural enemies are referred to as 'distance-responsive predators' if they kill seeds or seedlings near the parent tree, or 'density-dependent predators' if they kill seeds or seedlings where they are most abundant (which is typically near the parent tree). Such predators can prevent any one species from dominating the landscape, because if that species is too common, there will be few safe places for its seedlings to survive. However, because the predators are host-specific (also called specialists), they will not harm other tree species. As a result, if a species becomes very rare, then more predator-free areas will become available, giving that species' seedlings a competitive advantage. This negative feedback allows the tree species to coexist, and can be classified as a stabilizing mechanism.
The Janzen–Connell hypothesis has been called a special case of keystone predation, predator partitioning or the pest pressure hypothesis. The pest pressure hypothesis states that plant diversity is maintained by specialist natural enemies. The Janzen–Connell hypothesis expands on this by claiming that the natural enemies are not only specialists, but also distance-responsive or density-responsive. This mechanism has been proposed as promoting forest diversity, as it promotes the survival of a number of different plant species within one localized region. While previously thought to explain the high diversity of tropical forests in particular, subsequent research has demonstrated the applicability of the Janzen–Connell hypothesis in temperate settings as well. The black cherry is one such example of a temperate forest species whose growth patterns can still be explained by the Janzen–Connell hypothesis.

Limiting similarity
Limiting similarity (informally "limsim") is a concept in theoretical ecology and community ecology that proposes the existence of a maximum level of niche overlap between two given species that will allow continued coexistence.
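A classical way to quantify niche overlap, due to MacArthur and Levins, assumes Gaussian resource-utilization curves of equal width w whose peaks sit a distance d apart on the resource axis, giving a competition coefficient α = exp(−d²/4w²); limiting-similarity arguments then ask how large α can be before coexistence fails. A minimal sketch (the function name and parameter values are illustrative):

```python
import math

def gaussian_overlap(d, w):
    """MacArthur-Levins competition coefficient for two species with
    Gaussian resource-utilization curves of equal width w whose peaks
    are a distance d apart: alpha = exp(-d**2 / (4 * w**2))."""
    return math.exp(-d**2 / (4 * w**2))

# Identical niches (d = 0) give complete overlap, alpha = 1,
# recovering the competitive exclusion limit.
full = gaussian_overlap(0.0, 1.0)

# Classical limiting-similarity arguments put the coexistence boundary
# where peak separation is comparable to niche width (d/w around 1).
moderate = gaussian_overlap(1.0, 1.0)  # exp(-1/4)
distant = gaussian_overlap(3.0, 1.0)   # exp(-9/4), weak competition
```

Overlap decays rapidly with separation, which is why widely spaced niches are uncontroversially compatible and the interesting theoretical question concerns the intermediate regime.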
This concept is a corollary of the competitive exclusion principle, which states that, controlling for all else, two species competing for exactly the same resources cannot stably coexist. It assumes normally-distributed resource utilization curves ordered linearly along a resource axis, and as such, it is often considered to be an oversimplified model of species interactions. Moreover, it has theoretical weaknesses, and it is poor at generating real-world predictions or falsifiable hypotheses. Thus, the concept has fallen somewhat out of favor except in didactic settings (where it is commonly referenced), and has largely been replaced by more complex and inclusive theories.

Metabolic theory of ecology
The metabolic theory of ecology (MTE) is an extension of Kleiber's law and posits that the metabolic rate of organisms is the fundamental biological rate that governs most observed patterns in ecology. MTE is part of a larger set of theory known as metabolic scaling theory that attempts to provide a unified theory for the importance of metabolism in driving pattern and process in biology from the level of cells all the way to the biosphere.
MTE is based on an interpretation of the relationships between body size, body temperature, and metabolic rate across all organisms. Small-bodied organisms tend to have higher mass-specific metabolic rates than larger-bodied organisms. Furthermore, organisms that operate at warm temperatures through endothermy or by living in warm environments tend towards higher metabolic rates than organisms that operate at colder temperatures. This pattern is consistent from the unicellular level up to the level of the largest animals and plants on the planet.
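Under the MTE form of Kleiber's law, whole-organism metabolic rate scales as B = b₀ M^(3/4) e^(−E/kT), so the mass-specific rate B/M falls off as M^(−1/4). The sketch below illustrates this; the normalization b₀, the activation energy value, and the specific masses are illustrative assumptions:

```python
import math

BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV/K
ACTIVATION_E = 0.65      # typical activation energy in eV (MTE assumption)

def metabolic_rate(mass_kg, temp_k, b0=1.0):
    """Whole-organism metabolic rate under the MTE form
    B = b0 * M**(3/4) * exp(-E / (k * T)).
    b0 is an arbitrary normalization used for illustration."""
    return b0 * mass_kg ** 0.75 * math.exp(-ACTIVATION_E / (BOLTZMANN_EV * temp_k))

def mass_specific_rate(mass_kg, temp_k, b0=1.0):
    """Per-unit-mass rate B / M, which scales as M**(-1/4):
    small organisms burn energy faster per gram than large ones."""
    return metabolic_rate(mass_kg, temp_k, b0) / mass_kg

mouse = mass_specific_rate(0.02, 310.0)     # ~20 g at mammalian body temp
elephant = mass_specific_rate(5000.0, 310.0)
# Mass-specific rates differ by (5000 / 0.02) ** 0.25, roughly 22-fold,
# in the mouse's favor.
```

At equal body temperature the Boltzmann factor cancels out of the ratio, so the 22-fold difference here reflects the M^(−1/4) scaling alone.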
In MTE, this relationship is considered to be the single constraint that defines biological processes at all levels of organization (from the individual up to the ecosystem level). MTE is thus a macroecological theory that aims to be universal in scope and application.

Population biology
Population biology is an interdisciplinary field combining the areas of ecology and evolutionary biology. Population biology draws on tools from mathematics, statistics, genomics, genetics, and systematics. Population biologists study allele frequency changes (evolution) within populations of the same species (population genetics), and interactions between populations of different species (ecology).

Quantitative ecology
Quantitative ecology is the application of advanced mathematical and statistical tools to any number of problems in the field of ecology. It is a small but growing subfield in ecology, reflecting the demand among practicing ecologists to interpret ever larger and more complex data sets using quantitative reasoning. Quantitative ecologists might apply some combination of deterministic or stochastic mathematical models to theoretical questions or they might use sophisticated methods in applied statistics for experimental design and hypothesis testing. Typical problems in quantitative ecology include estimating the dynamics and status of wild populations, modeling the impacts of anthropogenic or climatic change on ecological communities, and predicting the spread of invasive species or disease outbreaks.
Quantitative ecology, which mainly focuses on statistical and computational methods for addressing applied problems, is distinct from theoretical ecology, which tends to focus on understanding the dynamics of simple mechanistic models and their implications for a general set of biological systems using mathematical arguments.

R* rule (ecology)
The R* rule (also called the resource-ratio hypothesis) is a hypothesis in community ecology that attempts to predict which species will become dominant as the result of competition for resources. The hypothesis was formulated by American ecologist David Tilman. It predicts that if multiple species are competing for a single limiting resource, then whichever species can survive at the lowest equilibrium resource level (i.e., the R*) can outcompete all other species. If two species are competing for two resources, then coexistence is only possible if each species has a lower R* on one of the resources. For example, two phytoplankton species may be able to coexist if one is more limited by nitrogen, and the other is more limited by phosphorus.
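The prediction can be illustrated with a minimal chemostat-style model in which two consumers share one limiting resource: species i grows at per-capita rate c_i R − m_i, so its break-even resource level is R*_i = m_i / c_i, and the species with the lower R* should exclude the other. All names and parameter values below are illustrative, not drawn from Tilman's experiments:

```python
def simulate_rstar(steps=200000, dt=0.01):
    """Euler integration of two consumers on one limiting resource in a
    chemostat. Resource: dR/dt = D*(S - R) - sum_i c_i * N_i * R.
    Consumer i: dN_i/dt = N_i * (c_i * R - m_i), so R*_i = m_i / c_i."""
    S, D = 10.0, 0.2   # resource supply level and dilution rate
    c = (1.0, 1.0)     # per-capita uptake/conversion rates
    m = (0.5, 1.0)     # mortality rates -> R* values of 0.5 and 1.0
    R, N = S, [0.01, 0.01]
    for _ in range(steps):
        dR = D * (S - R) - (c[0] * N[0] + c[1] * N[1]) * R
        dN = [N[i] * (c[i] * R - m[i]) for i in range(2)]
        R += dR * dt
        N = [N[i] + dN[i] * dt for i in range(2)]
    return R, N

R, N = simulate_rstar()
# Species 0 (R* = 0.5) persists and draws the resource down to its R*;
# species 1 (R* = 1.0) is competitively excluded.
```

At equilibrium the resource sits at the winner's R*, which is the signature the experimental tests described below look for.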
A large number of experimental studies have attempted to verify the predictions of the R* rule. Many studies have shown that when multiple plankton are grown together, the species with the lowest R* will dominate, or coexist if they are limited by multiple resources. There are fewer tests of the R* rule in communities of larger organisms, in part because of the difficulty of creating a situation in which only a single resource is limiting. However, some studies have used the R* rule with multiple resources to predict which groups of plants will be able to coexist.

Recruitment (biology)
In biology, especially marine biology, recruitment occurs when a juvenile organism joins a population, whether by birth or immigration, usually at a stage when the organisms are settled and able to be detected by an observer. There are two types of recruitment: closed and open. In the study of fisheries, recruitment is "the number of fish surviving to enter the fishery or to some life history stage such as settlement or maturity".

Robert Ulanowicz
Robert Edward Ulanowicz is an American theoretical ecologist and philosopher of Polish descent who, in his search for a unified theory of ecology, has formulated a paradigm he calls Process Ecology. He was born on September 17, 1943, in Baltimore, Maryland.
He served as Professor of Theoretical Ecology at the University of Maryland Center for Environmental Science's Chesapeake Biological Laboratory in Solomons, Maryland until his retirement in 2008. Ulanowicz received both his BS and PhD in chemical engineering from Johns Hopkins University in 1964 and 1968, respectively.
Ulanowicz currently resides in Gainesville, Florida, where he holds a Courtesy Professorship in the Department of Biology at the University of Florida. Since relocating to Florida, Ulanowicz has served as a scientific advisor to the Howard T. Odum Florida Springs Institute, an organization dedicated to the preservation and welfare of Florida's natural springs.

Storage effect
The storage effect is a coexistence mechanism proposed in the ecological theory of species coexistence, which tries to explain how such a wide variety of similar species are able to coexist within the same ecological community or guild. The storage effect was originally proposed in the 1980s to explain coexistence in diverse communities of coral reef fish; it has since been generalized to cover a variety of ecological communities. The theory proposes one way for multiple species to coexist: in a changing environment, no species can be the best under all conditions. Instead, each species must have a unique response to varying environmental conditions, and a way of buffering against the effects of bad years. The storage effect gets its name because each population "stores" the gains in good years or microhabitats (patches) to help it survive population losses in bad years or patches. One strength of this theory is that, unlike most coexistence mechanisms, the storage effect can be measured and quantified, with units of per-capita growth rate (offspring per adult per generation). The storage effect can be caused by both temporal and spatial variation. The temporal storage effect (often referred to as simply "the storage effect") occurs when species benefit from changes in year-to-year environmental patterns, while the spatial storage effect occurs when species benefit from variation in microhabitats across a landscape.

Unified neutral theory of biodiversity
The unified neutral theory of biodiversity and biogeography (here "Unified Theory" or "UNTB") is a hypothesis and the title of a monograph by ecologist Stephen Hubbell. The hypothesis aims to explain the diversity and relative abundance of species in ecological communities. Like other neutral theories of ecology, Hubbell's hypothesis assumes that the differences between members of an ecological community of trophically similar species are "neutral", or irrelevant to their success. This implies that biodiversity arises at random, as each species follows a random walk. The hypothesis has sparked controversy, and some authors consider it a more complex version of other null models that fit the data better. Neutrality means that at a given trophic level in a food web, species are equivalent in birth rates, death rates, dispersal rates and speciation rates, when measured on a per-capita basis. This can be considered a null hypothesis to niche theory. Hubbell built on earlier neutral concepts, including MacArthur & Wilson's theory of island biogeography and Gould's concepts of symmetry and null models. An ecological community is a group of trophically similar, sympatric species that actually or potentially compete in a local area for the same or similar resources. Under the Unified Theory, complex ecological interactions are permitted among individuals of an ecological community (such as competition and cooperation), provided that all individuals obey the same rules. Asymmetric phenomena such as parasitism and predation are ruled out by the terms of reference; but cooperative strategies such as swarming, and negative interactions such as competing for limited food or light are allowed (so long as all individuals behave in the same way).
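The core of the neutral assumption can be illustrated with a minimal zero-sum drift simulation: in a local community of fixed size, a random individual dies and is replaced by the offspring of another random individual, so every species' abundance follows a random walk. This sketch omits immigration and speciation, which the full Unified Theory includes, and its function name and parameters are illustrative:

```python
import random

def neutral_drift(num_species=10, per_species=10, steps=20000, seed=42):
    """Zero-sum ecological drift in a local community of fixed size:
    at each step one random individual dies and is replaced by the
    offspring of another random individual. All species are
    demographically identical, so without immigration or speciation
    species richness can only decline over time."""
    rng = random.Random(seed)
    community = [s for s in range(num_species) for _ in range(per_species)]
    for _ in range(steps):
        dead = rng.randrange(len(community))
        parent = rng.randrange(len(community))
        community[dead] = community[parent]
    return community

final = neutral_drift()
richness = len(set(final))
# Drift erodes diversity: richness ends at or below the initial 10,
# while the community size stays fixed at 100 individuals.
```

In the full theory, speciation (governed by the biodiversity constant θ) and immigration from the metacommunity balance this erosion, producing the predicted species-abundance distributions.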
The Unified Theory also makes predictions that have profound implications for the management of biodiversity, especially the management of rare species. The theory predicts the existence of a fundamental biodiversity constant, conventionally written θ, that appears to govern species richness on a wide variety of spatial and temporal scales.