The study of who eats whom, mapped in networks known as “food webs,” is essential to understanding and preserving habitats.
The traditional way to evaluate the overall health of an ecosystem is to conduct a biological inventory, note the robustness of keystone species and establish a pecking order: photosynthesis grows the grass, zebras eat the grass, and lions eat the zebras until the circle of life is complete.
Such practices have been the foundation of resource management plans for commercial fisheries and national parks for decades. If links in the chain are broken, the thinking goes, the entire food web starts to unravel, leading to famine, disease and ultimately environmental collapse.
The demise of seemingly unrelated animals can have a devastating cascading effect on entire food systems, leading scientists to wonder, for example, whether the mighty lion trumps the lowly dung beetle in providing necessary environmental services on the African savanna.
Figuring this out in advance has proven difficult, since building accurate computer models based on the myriad ways species interact is daunting. Until recently, the diversity of life posed more variables than there was processing power to handle them.
Drs. Stefano Allesina of the National Center for Ecological Analysis and Synthesis and Mercedes Pascual of the University of Michigan may have a partial solution to that problem: finding the quickest way to dismantle an ecosystem, at least on paper. Their research, published Sept. 4 in the academic journal PLoS Computational Biology, identifies with deadly accuracy which species are essential to the survival of ecosystems.
“Species are not isolated in deep space. They are always interacting with each other, and therefore any modification we make on the environment or any impact we have on a single species is important,” Allesina said.
Both co-authors are food web experts, and their novel approach to co-extinction belongs to the emerging field of computational ecology, an academic discipline that takes field biology into the realm of networks and supercomputers, applying the logic of higher math to ecological questions.
In computer lingo, each species is a node, a connection point in a vast network. Disrupting nodes leads to a loss of functionality and, eventually, system failure.
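In code, such a network can be as simple as a map from each species to its consumers. The Python sketch below is purely illustrative; the species and the dictionary representation are assumptions for the example, not the researchers’ data structures.

```python
# A food web as a directed graph: each species maps to the set of
# species that eat it (the direction energy flows). Purely illustrative.
food_web = {
    "grass": {"zebra"},   # grass is eaten by zebras
    "zebra": {"lion"},    # zebras are eaten by lions
    "lion": set(),        # top predator: nothing eats the lion
}

def remove_node(web, species):
    """Delete a node and return the consumers that just lost a food source."""
    stranded = web.pop(species, set())
    for eaters in web.values():
        eaters.discard(species)   # sever every remaining link to it
    return stranded

print(remove_node(food_web, "grass"))  # {'zebra'}: the zebra now has no prey
```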
Allesina’s “aha” moment arrived while reading a paper in an applied-math journal that explained how the Google algorithm PageRank worked. “I realized it was almost the same idea,” he said. “Species are important if they point to other species.”
Google relies on PageRank to measure the utility and value of Web pages. It is premised on the reasoning that incoming links indicate a page’s intrinsic worth to other pages: more incoming links mean the page is valued; fewer mean it is not.
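The core of that idea fits in a few lines. The toy link graph and the customary damping factor below are illustrative assumptions; this is a minimal sketch of the published algorithm, not Google’s production code:

```python
# Power iteration over a toy link graph: a page's rank is fed by the
# ranks of the pages linking to it, split across each linker's outlinks.
links = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}  # page -> pages it links to
pages = list(links)
rank = {p: 1.0 / len(pages) for p in pages}
damping = 0.85  # the customary damping factor

for _ in range(50):  # iterate until the ranks settle
    rank = {
        p: (1 - damping) / len(pages)
        + damping * sum(rank[q] / len(links[q]) for q in pages if p in links[q])
        for p in pages
    }

print(sorted(rank, key=rank.get, reverse=True))  # most "pointed-to" page first
```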
Inspired by PageRank, their Google-like algorithm is designed to mimic the effect the loss of a species has on a food web. It is adapted to determine the importance of species to one another, taking into account not just a species’ primacy of place but the number of flora and fauna that rely on it for survival, counting links to determine each species’ value to the overall health of an ecosystem.
Key to understanding their algorithm is the sensitive equilibrium of ecosystems.
The researchers had to adapt the algorithm to match the hierarchical nature of food webs: the energy expended to grow the grass that zebras eat ends with the lion. To approximate these dead ends in the food chain, they factored in the “brown cycle,” the decomposition of organic matter, which Allesina termed a “root node,” based on the premise that all animals excrete and eventually die, recycling nutrients back into the ecosystem.
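Sketching that adaptation in the same toy Python helps: point each species at what it depends on, add a detritus root node so nutrients loop back around, and run the same ranking iteration. This is a loose illustration of the idea described above, with invented names, not the authors’ published method:

```python
# Links now run from each species to what it depends on, so importance
# flows toward the things everything else relies on. The "detritus"
# root node closes the loop: every species eventually dies and feeds
# the decomposers, and the producers draw on the recycled nutrients.
depends_on = {
    "lion": ["zebra"],
    "zebra": ["grass"],
    "grass": ["detritus"],
    "detritus": ["lion", "zebra", "grass"],  # everything decomposes
}

def rank_nodes(out_links, damping=0.85, iters=100):
    """The same power iteration as PageRank, over an arbitrary link graph."""
    nodes = list(out_links)
    rank = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iters):
        rank = {
            n: (1 - damping) / len(nodes)
            + damping * sum(rank[m] / len(out_links[m])
                            for m in nodes if n in out_links[m])
            for n in nodes
        }
    return rank

ranks = rank_nodes(depends_on)
ranks.pop("detritus")  # the root node is bookkeeping, not a species
print(max(ranks, key=ranks.get))  # 'grass': the web leans hardest on it
```

In this miniature web the grass outranks the lion, echoing the authors’ point that the species everything else depends on, not necessarily the apex predator, can matter most.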
When tested against published accounts of ecosystem collapse, their algorithm proved accurate in 11 out of 12 cases, outpacing previous models designed to measure the effects of the stress placed on food webs by species loss. The simple PageRank tweak arrived at the same outcome as those slower, more elaborate models.
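One way to picture such a test, continuing the toy Python above: remove species in ranked order and count the “secondary extinctions” that cascade through the web. The rule used here, that a consumer starves when all of its prey are gone, is a common simplification in the field, not necessarily the benchmark the authors used:

```python
def simulate_collapse(eats, ranking):
    """Remove species in ranked order, tracking cascading losses.

    eats maps each species to the set of things it eats; species with
    an empty set are producers and never starve."""
    web = {sp: set(prey) for sp, prey in eats.items()}
    producers = {sp for sp, prey in eats.items() if not prey}
    lost = []
    for target in ranking:
        queue = [target]
        while queue:
            sp = queue.pop()
            if sp not in web:
                continue  # already gone, by removal or by cascade
            del web[sp]
            lost.append(sp)
            for other, prey in web.items():
                prey.discard(sp)
                if not prey and other not in producers:
                    queue.append(other)  # starved: a secondary extinction
    return lost

toy = {"grass": set(), "shrub": set(),
       "zebra": {"grass"}, "antelope": {"grass", "shrub"},
       "lion": {"zebra", "antelope"}}
print(simulate_collapse(toy, ["grass"]))
# ['grass', 'zebra']: the antelope falls back on shrubs, so the lion survives
```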
The researchers ran their algorithm across a broad spectrum of ecosystems where the effects of species loss are known, ranging from the Coachella Valley near Palm Springs to the Chesapeake Bay in the Mid-Atlantic region. They achieved similar results in marine, rain forest and grassland regions.
In the 12th case, Allesina attributed the failure to “confounding” anomalies in the food chain, such as cannibalism, that are not reflected in the algorithm and that could be compared to “gaming the system” in the online world to gain a higher page rank.
Math and ecology have worked together for a long time; computer models mimicking complex food systems are just beginning to have the same predictive success as field biology studies, observed Neo Martinez, the director of the Pacific Ecoinformatics and Computational Ecology Lab at Berkeley.
According to Martinez, the PageRank-inspired algorithm is a refinement of existing computer models. “The Google algorithm is a quick and efficient way of pretty precisely figuring out who’s most important,” he said.
In the field, the algorithm could be used to make preliminary assessments of ecosystems under stress, giving resource managers the ability to allocate resources where they are most needed, if they can.
“It’s a first step,” Allesina said.