The war being waged against wildfires from Southern California to Greece and Australia is almost as complex as the infernos themselves. Innovative computer mapping tools keep advancing, as do airborne imaging techniques that can look straight through black smoke to reveal emerging dangers no firefighter on the ground could see. However, some crews still battle blazes on bulldozers older than they are, and funding is tight all around. Still, the breakthroughs keep coming.
Part I: THE POWER OF ‘LOOK-DOWN’ TECHNOLOGY
Part II: UNDERSTANDING WILDFIRE BEHAVIOR AND PREDICTING ITS SPREAD
Part III: WHAT’S REALLY HAPPENING ON U.S. FIRELINES
Part IV: CATCHING WILDFIRE ARSONISTS RED-HANDED
Part V: SMART SOLUTIONS GOING FORWARD
Sophisticated imaging from satellites and unmanned aerial vehicles (or UAVs) can help real-time firefighting efforts enormously, but research scientists’ efforts to better understand fire spread and to create accurate predictive computer programs may hold the keys to a whole new range of breakthroughs.
The quest to understand fire and predict its movement is certainly not new. In early attempts that sound primitive today, firefighters estimated burn rates with tables built on data extrapolated from laboratory experiments measuring, for instance, how fast pine needles burn in wind tunnels.
Things looked set to change dramatically in 2002, when two atmospheric scientists, Rod Linn of Los Alamos National Laboratory (LANL) and Michael Bradley of Lawrence Livermore National Laboratory, were at work on a physics-based computer modeling system to predict wildfire behavior. Designing it on what was then the nation’s largest supercomputer, they were playing a whole new ballgame, and they hoped to succeed well enough to reduce wildfire damage and the accompanying loss of life, property and natural resources.
Initially, they used FIRETEC, a physics-based model created by Linn, to simulate wildfire behavior, factoring in changing weather and the effects of complex terrain. FIRETEC was encouragingly successful at simulating the deadly 1991 Oakland, Calif., wildfires after the fact, and it seemed they were onto something potentially important.
The researchers then coupled FIRETEC with HIGRAD, a program that adds in the atmospheric aspects of a given fire and the effects of smoke plumes. The combination’s effectiveness was validated by again inputting documented information about the Oakland fires. The results were exciting: imagine being able to know what a fire was likely to do before it happened.
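To get a feel for what such coupling involves, consider a toy sketch: a fire component consumes fuel and releases heat, an atmospheric component turns that heat into stronger winds, and a feedback step lets the winds push the fire faster. This illustrates only the structure of the loop; FIRETEC and HIGRAD solve full physics that this sketch does not attempt, and every name and number below is invented.

```python
# Toy sketch of the coupled fire-atmosphere loop behind pairings like
# FIRETEC/HIGRAD. The real models solve fluid-dynamics equations;
# every quantity below is illustrative only.
import numpy as np

CELLS = 100           # one-dimensional strip of terrain
STEPS = 40
AMBIENT_WIND = 2.0    # m/s, imposed background wind (made up)

fuel = np.ones(CELLS)                  # fraction of fuel left per cell
burning = np.zeros(CELLS, dtype=bool)
burning[0] = True                      # ignite one end of the strip

wind = AMBIENT_WIND
for _ in range(STEPS):
    # Fire side: burning cells consume fuel and release heat.
    fuel[burning] -= 0.2
    heat = 0.2 * np.count_nonzero(burning)

    # Atmosphere side: heat release strengthens the local wind
    # (a crude stand-in for plume dynamics).
    wind = AMBIENT_WIND + 0.5 * heat

    # Feedback: the stronger the wind, the farther the fire front jumps.
    front = np.flatnonzero(burning)
    if front.size:
        reach = min(front.max() + max(1, int(wind)), CELLS - 1)
        burning[front.max():reach + 1] = True
    burning &= fuel > 0                # burned-out cells extinguish

print(f"final wind {wind:.1f} m/s; "
      f"cells touched by fire: {np.count_nonzero(fuel < 1.0)}")
```

Even this caricature captures the qualitative point the researchers were chasing: fire and atmosphere amplify each other, which is why modeling either in isolation misses so much.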
By simulating both past fires and hypothetical future fires, scientists hoped they could eventually predict wildfire spread accurately enough to help first responders. Advances here might enable fire chiefs to weigh different deployment scenarios to make the most of limited manpower and resources. They could also help plan the timing of prescribed or controlled burns (those deliberately set by firefighters to clear fire breaks that could stop a fire in its tracks) to factor in safety as well as effectiveness. For example, a sudden influx of dense smoke from a controlled burn can drastically reduce highway visibility and cause fatal accidents.
The 2000 Cerro Grande blaze, one of New Mexico’s most destructive wildfires ever, began as a National Park Service controlled burn in Bandelier National Monument. Driven by high winds, it escaped control. Although no one died, the blaze eventually burned more than 47,000 acres and destroyed 235 homes in Los Alamos, even endangering the nuclear laboratory itself.
Despite the potential of their research, a funding shortfall halted Linn and Bradley’s joint work, according to Bradley. Fluctuating funding streams are a stumbling block for almost everyone involved in the war on wildfires, but applied research continues, despite interruptions, in government and university labs. Among those carrying it forward are Linn and Phil Cunningham, a former Florida State University associate professor of meteorology who has just moved to LANL to work with Linn. Cunningham, for example, has written papers on efforts to explain how fire behavior couples with the atmosphere.
Advances in research, however, face another familiar obstacle: computer processing capacity. Few supercomputers in the U.S. can run such complex programs. Roadrunner, built by IBM and LANL for approximately $120 million to model nuclear explosions, is the most capable to date. It holds world-record data-processing speeds, and because it uses around one-third of the electricity of its nearest competitor, it can reportedly save millions of dollars a year.
Researchers are keen to use Roadrunner in a host of ways, including studying the entire human visual cortex in real time and helping the U.S. certify the reliability of its nuclear weapons without conducting underground nuclear tests.
Competing for supercomputer time isn’t the only thing that makes advancing fire-spread prediction research difficult. Even the most advanced research models remain so “computationally intensive that it’s completely impractical from a real-time prediction standpoint,” Cunningham explained by phone from Florida. “We can’t say, ‘OK, here’s the fire. Let’s run the model quickly and see where it’s going to go.’ But I think the valuable part of our work is to think in terms of the fundamental behavior and scenarios. ‘What happens if …’ and, ‘How does fire behave when you have this or that?’”
Examining the roles and impacts of various weather elements on fire represents, he believes, “pretty important advances” in understanding why fires do what they do, how they depend on the atmosphere and what they do to the atmosphere. Researchers are exploring how increasing wind or reducing humidity, for instance, changes fire behavior.
Another line of research with promising real-world potential asks why fires suddenly take off in unexpected directions or leave different burn patterns on the land. The spread and the patterns it leaves, streaks or lines commonly visible from the air, aren’t random, said Cunningham. “There are some commonalities between fires in terms of how the land gets burned and some patterns of burning. They’re typically organized, often in lines, roughly along the wind direction.”
Computer Challenges to Fighting Fires
Current computer modeling tools used by the U.S. firefighting community range in complexity from something akin to drawing with paper and pencil to a far more useful program that, when fed certain data, can run in seconds on a desktop and give a respectable ballpark estimate of how fast a fire will spread.
Many wildfire crews and agencies, including the U.S. Forest Service in California, the National Park Service and the Bureau of Land Management, currently use FARSITE, a Windows-based fire behavior and growth simulator whose output can be fed into GIS systems. It was developed in the 1990s, along with programs like BehavePlus, at the USDA Forest Service’s Rocky Mountain Research Station fire sciences laboratory. These and other programs aren’t even close to perfect, but they can give an idea of where flames will spread and how fierce they will be.
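Under the hood, desktop tools of this kind rest on semi-empirical spread equations; Rothermel’s 1972 surface-fire model, the basis for both BehavePlus and FARSITE, is the classic example. The sketch below is a heavily simplified stand-in for that style of calculation, with every coefficient invented for illustration, but it shows why a ballpark answer costs only arithmetic, not supercomputer time.

```python
# Heavily simplified sketch of a Rothermel-style rate-of-spread estimate.
# Real tools (BehavePlus, FARSITE) use calibrated fuel models and many more
# terms; every coefficient below is invented for illustration only.

def spread_rate(fuel_base, wind_speed, slope_pct, fuel_moisture):
    """Return an illustrative spread rate in meters per minute.

    fuel_base     -- no-wind, no-slope spread rate for the fuel type (m/min)
    wind_speed    -- midflame wind speed (m/s)
    slope_pct     -- terrain slope, in percent
    fuel_moisture -- fuel moisture as a fraction, 0.0 to 1.0
    """
    wind_factor = 1.0 + 0.9 * wind_speed ** 1.2             # wind drives spread
    slope_factor = 1.0 + 0.02 * slope_pct                   # fire runs uphill faster
    moisture_damping = max(0.0, 1.0 - 3.0 * fuel_moisture)  # wet fuel resists burning
    return fuel_base * wind_factor * slope_factor * moisture_damping

# The whole "simulation" is a few multiplications, which is why a desktop
# answers in seconds. Compare dry, windy grass with damp, calm brush:
print(spread_rate(1.5, wind_speed=8.0, slope_pct=30, fuel_moisture=0.05))
print(spread_rate(0.8, wind_speed=1.0, slope_pct=5, fuel_moisture=0.25))
```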
And, as Mark Finney, FARSITE’s developer, cautioned the Associated Press in 2007: “The model is only as good as the analyst running it. A degree of expertise is required.”
Although the U.S. Forest Service had long used the software, spokesman Matt Mathes also warned the AP: “All this technology is great, but we still have to rely on the judgment of individual firefighters going through these areas and deciding what is defensible and what is safe to defend.”
FIRETEC has the potential to go much further in the realm of prediction. Yet simplifying the model computationally so it runs fast enough to serve as a practical, near real-time predictive tool is a brain-teasing goal. “That’s still on the table, as it were,” Cunningham acknowledged, “but it’s a complicated problem.”
Another challenge for researchers, one with major real-world ramifications if met, is to widen the span between the largest and smallest scales a program can handle simultaneously. Just as with the work at San Diego State University’s Viz Lab, there is a technical clash between encompassing the big mosaic and resolving the end user’s tiny single tile. The greater the gap between scales, the more computing power is needed.
“Even on supercomputers, you’re really limited,” Cunningham said. “To have your smallest scale at 1 meter and your biggest scale at let’s say 10, or 100, kilometers – you just can’t do it based on the computational power available.”
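The arithmetic behind that limit is easy to check: at 1-meter resolution, the cell count grows with the square of the domain’s width, and again with each vertical level. The sketch below assumes, purely for illustration, 100 vertical levels.

```python
# Back-of-envelope check on the scale problem: cell counts explode as the
# span between the smallest and largest resolved scales widens.

CELL_SIZE_M = 1.0        # smallest resolved scale: 1 meter

for domain_km in (1, 10, 100):
    side = int(domain_km * 1000 / CELL_SIZE_M)  # cells along one side
    cells = side * side * 100                   # 100 vertical levels (assumed)
    print(f"{domain_km:>3} km domain at 1 m resolution: {cells:.1e} cells")

# A 100 km domain at 1 m resolution needs on the order of 1e12 cells,
# each updated at every time step -- far beyond routine capacity.
```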
Peter Sadler, a geology professor in the earth sciences department at the University of California, Riverside, has been researching how the growth of shrubbery, grasses and other combustibles affects fires, and he has been working with graduate student Greg Miller on wildfire simulations.
“Our programs will not be in the hands of the firefighters,” Sadler wrote in an e-mail, “in the sense that they could use them to predict the short-term behavior of individual fires.” Yet, like Cunningham’s and Linn’s, his research could turn out to be important. And while tangible benefits can seem remote in much scientific research, given the scale of the global wildfire problem, researchers naturally favor exploring, not ignoring, every promising avenue.
NEXTMap is a high-resolution digital elevation model: think detailed terrain mapping and 3-D visualizations. The program has grown significantly since Southern California’s 2007 fires, Kevin Thomas, vice president of marketing at Intermap Technologies, wrote in an e-mail, and his company is now “in the final stages” of building data-rich digital maps covering 2.4 million square kilometers of Western Europe and 8 million square kilometers of the continental United States. Thomas noted that the data already had helped firefighters in Southern California, and he expects use of NEXTMap to grow.
We Need Bodies, Too
If complex computer challenges and lack of research funding delay progress in this field, so, too, do insufficient budgets for hiring the top-notch programmers and fire experts needed to move things forward. Given the extent of wildfire devastation globally, are enough resources committed to this research?
“That would be a no, a very resounding no,” Cunningham said before leaving Florida State University. “My funding has been sporadic and erratic.” Others in the field, who prefer to remain anonymous, see the very structure that exists for funding fire research as inherently flawed. The Forest Service, the primary agency allocating research funds, has its own research labs. What is more, the Forest Service’s budget must stretch to cover both research and national operations, including firefighting. So it is hardly surprising that, in active fire seasons, research budgets quickly feel the pinch, and the scientists in this arena feel it, too. But researchers like Linn and Cunningham are committed to maintaining a funding stream and to moving their research forward. True understanding of fire behavior is too important to warrant anything less.