WALTHAM, Mass., Apr. 17, 2007 – Researchers at the University of California, Santa Barbara (UCSB) are harnessing supercomputers and electronic circuit theory to help save wildlife from ever-shrinking habitats in an emerging scientific field called “computational ecology.” The project is run by the University’s National Center for Ecological Analysis and Synthesis (NCEAS).
NCEAS scientists are applying electronic circuit theory to model wildlife migration and gene flow across fragmented landscapes. The research could be instrumental in smart conservation planning, helping organizations decide which lands to preserve or restore – and where to best invest their tight conservation budgets – in order to preserve habitat and connectivity for wildlife populations.
Due to the massive volume of landscape data and the novel application of algorithms from circuit theory, NCEAS is working with Interactive Supercomputing Inc. (ISC) to speed up its code using state-of-the-art sparse linear solvers, graph computations, vectorization and parallelization with ISC’s Star-P™. The result has been a dramatic reduction in computing time, from days to minutes, on their 8-core server.
“It turns out that circuit theory shares a surprising number of properties with ecological theory describing animal movements and connectivity,” said Brad McRae, the NCEAS project leader. “We can now represent landscapes as conductive surfaces – with features like forests and highways having different resistance to movement – and analyze connectivity across them using powerful circuit algorithms. Unlike standard conservation planning tools, these algorithms simultaneously incorporate all possible pathways when predicting how corridors, barriers, and other features affect movement and gene flow over large areas.”
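The circuit analogy can be illustrated with a small resistor network. The sketch below (using SciPy, and not taken from the NCEAS code itself) builds the weighted graph Laplacian of a network, grounds one node, injects one ampere at another, and solves the resulting sparse linear system for node voltages; with one ampere injected, the source voltage equals the effective resistance between the two nodes — the same quantity the circuit model uses as a measure of landscape connectivity.

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.linalg import spsolve

def effective_resistance(n, edges, source, ground):
    """Effective resistance between two nodes of a resistor network.

    edges: list of (i, j, conductance) tuples. Builds the weighted
    graph Laplacian, grounds the reference node, injects 1 A at the
    source, and solves the sparse system L v = I for node voltages.
    """
    rows, cols, vals = [], [], []
    for i, j, g in edges:
        rows += [i, j, i, j]
        cols += [j, i, i, j]
        vals += [-g, -g, g, g]   # off-diagonal -g, diagonal accumulates +g
    L = csr_matrix((vals, (rows, cols)), shape=(n, n))
    keep = [k for k in range(n) if k != ground]  # remove the grounded node
    current = np.zeros(n)
    current[source] = 1.0                        # inject 1 ampere
    v = spsolve(L[keep][:, keep].tocsc(), current[keep])
    # With 1 A injected, the source voltage is the effective resistance.
    return v[keep.index(source)]
```

For example, two parallel paths of resistance 2 each (a two-edge chain of unit conductances, plus a direct edge of conductance 0.5) give a combined effective resistance of 1.0 — adding a second pathway lowers the resistance, which is exactly why the circuit model rewards landscapes with multiple corridors.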
Corridors are areas that connect important habitats in human-altered landscapes. They provide natural avenues along which animals can travel, plants can propagate, genetic interchange can occur, species can move in response to environmental changes and natural disasters, and threatened populations can be replenished from other areas. A good example is “Y2Y,” or the Yellowstone to Yukon corridor, where U.S. and Canadian conservation organizations are trying to identify which habitats to conserve to protect species from harmful decline or extinction.
In applying their software to these problems, NCEAS scientists have modeled mountain lion movements in Southern California to identify important connective habitats and corridors. In Central America they modeled how habitat connectivity affects gene flow among threatened populations of mahogany throughout the species’ range. They are also analyzing connectivity among populations of wolverines, kit foxes and jaguars. For each species, researchers analyze geographic datasets representing habitat suitability over vast areas – in some cases spanning entire continents.
The challenge was balancing how large an area the maps could cover against how finely they could be gridded, explained McRae. “Even a relatively small region like the three-county area of Southern California can contain millions of raster cells, but our computing resources limited how finely we could grid those locations. While a mountain lion might perceive its habitat at a scale of about 100 meters, we originally had to increase the cell sizes to around a kilometer to keep our data requirements manageable,” he said. “And even at these lower resolutions, running the models on a single-processor computer without optimized code took three days to complete.”
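The cost of finer grids grows quickly because cell count scales with the square of the resolution refinement: halving the cell size quadruples the number of cells. A quick calculation for a hypothetical study area (the dimensions below are illustrative, not the actual Southern California dataset) shows the hundredfold jump from kilometer to 100-meter cells:

```python
def n_cells(width_m, height_m, cell_m):
    """Number of raster cells covering a rectangular study area."""
    return (width_m // cell_m) * (height_m // cell_m)

# Hypothetical 150 km x 100 km study area:
coarse = n_cells(150_000, 100_000, 1_000)  # 1 km cells  -> 15,000 cells
fine = n_cells(150_000, 100_000, 100)      # 100 m cells -> 1,500,000 cells
```

Going from 1 km to 100 m cells multiplies the node count of the underlying graph by 100, which is why the jump to biologically meaningful resolutions demanded faster solvers and more memory.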
A key step of the NCEAS simulations is a computation on a large graph (or network) that represents the connectivity of the landscape. UCSB Computer Scientist Viral Shah worked with the NCEAS researchers to integrate their code with GAPDT, a Star-P toolbox for graph computation developed by Shah and John Gilbert of UCSB’s Combinatorial Scientific Computing Laboratory together with ISC Vice President of Advanced Research Steve Reinhardt. Said Shah, “The graph toolbox allows researchers who are not experts in the field of combinatorial scientific computing to leverage its methods in their own research.”
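One way to see how a landscape becomes such a graph: each raster cell is a node, and adjacent cells are joined by edges whose conductance reflects habitat quality. The sketch below is a minimal illustration, assuming 4-neighbour connectivity and edge conductances equal to the mean of the two cells’ habitat values — a common convention, not necessarily the NCEAS formulation.

```python
import numpy as np
from scipy.sparse import coo_matrix, diags

def raster_to_laplacian(habitat):
    """Turn a habitat-quality raster into a weighted graph Laplacian.

    Every cell becomes a node; each pair of 4-neighbour cells is joined
    by an edge whose conductance is the mean of the two cells' habitat
    values (an illustrative convention). The Laplacian is the matrix
    that sparse linear solvers operate on in the circuit model.
    """
    h, w = habitat.shape
    idx = np.arange(h * w).reshape(h, w)
    flat = habitat.ravel()
    # right-neighbour and down-neighbour edge lists
    pairs = [(idx[:, :-1].ravel(), idx[:, 1:].ravel()),
             (idx[:-1, :].ravel(), idx[1:, :].ravel())]
    i = np.concatenate([p[0] for p in pairs])
    j = np.concatenate([p[1] for p in pairs])
    g = (flat[i] + flat[j]) / 2.0            # edge conductances
    adj = coo_matrix((np.r_[g, g], (np.r_[i, j], np.r_[j, i])),
                     shape=(h * w, h * w)).tocsr()
    deg = np.asarray(adj.sum(axis=1)).ravel()
    return diags(deg) - adj                  # Laplacian: degree minus adjacency
```

Even a modest raster produces a very large but very sparse matrix — each node touches at most four neighbours — which is why sparse solvers and graph toolboxes such as GAPDT are the natural computational fit.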
“The combination of vectorization with Star-P’s graph toolbox and efficient sparse linear solvers has allowed scientists to take full advantage of their 8-processor server (with 32 gigabytes of memory) to run their models,” said Reinhardt. “The result: scientists can now model larger maps with much finer grids, while cutting computing time from three days to about 15 minutes for typical problems.”
Star-P is an interactive parallel computing platform that lets scientists use their preferred desktop tools – MATLAB®, Python, R and others – to model landscape connectivity, running the models interactively while gaining the benefits of scalable HPC solutions. It eliminates the need to re-program the models in C or FORTRAN with MPI to run on a parallel computer, dramatically improving the researchers’ productivity.
“Habitat reduction and fragmentation are accelerating the decline of many native wildlife species,” said Ilya Mirman, vice president of marketing at ISC. “NCEAS’ novel approach of applying circuit theory to solve this problem blends well with Star-P’s novel way of making parallel computing available to anyone.”
About the National Center for Ecological Analysis and Synthesis
The National Center for Ecological Analysis and Synthesis (NCEAS) provides the intellectual atmosphere, facilities, equipment, and staff support to promote the analysis and synthesis of ecological information. Since 1995, NCEAS has hosted 3,500 individuals and supported 400 projects that have yielded more than 1,000 scientific articles. The projects have produced a wide array of outcomes, from specific results to general knowledge about ecology and its application to conservation and the management of resources. The Center has engaged hundreds of graduate students and grade school children, and has developed information access tools that are becoming the standard for the discipline.
About Interactive Supercomputing
Interactive Supercomputing (ISC) launched in 2004 to commercialize Star-P, an interactive parallel computing platform. With automatic parallelization and interactive execution of existing desktop simulation applications, Star-P merges two previously distinct environments – desktop computers and high performance servers – into one. Based in Waltham, Mass., the privately held company markets Star-P for a range of biomedical, financial, and government laboratory research applications. Additional information is available at http://www.interactivesupercomputing.com