Fighting Fire with Data

By Barry A. Cipra

SIAM News, July 25, 2004


Many times just the practical problems of gathering accurate data on local fire conditions can be as difficult and complex as the mathematics that follow.
---Norman Maclean, Young Men and Fire

Smoke from a forest fire in the Flatiron Mountains, at the base of the foothills of the Rocky Mountains near Boulder, Colorado. From the University Corporation for Atmospheric Research.

It's fire season in the forests and wildlands of America. Fit young men and women decked out in protective gear work grueling hours to keep the fires in check. (The days are gone when wildfire was seen as an unnecessary evil, to be prevented when possible and immediately suppressed when not, but fires still require management, especially when they threaten human habitation.) Into scenes of searing heat and against the occasional backdrop of towering flames, firefighters carry axes, shovels, chain-saws, and other tools of the trade.

A new, four-year, $2 million project funded by the National Science Foundation now aims to provide firefighters with one more tool: accurate, timely information.

The project, Data Dynamic Simulation for Disaster Management (DDSDM), is a collaborative effort involving experts in computational mathematics, stochastic modeling, meteorology and wildfire science, remote sensing, data assimilation, and secure communication infrastructure. The principal investigator is Jan Mandel, a professor of mathematics at the University of Colorado, Denver. Other members of the group include Lynn Bennethum, Leopoldo Franca, and Craig Johns of UC Denver, Janice Coen of the National Center for Atmospheric Research, Robert Kremens and Anthony Vodacek of the Rochester Institute of Technology, Craig Douglas of the University of Kentucky, and Wei Zhao of Texas A&M University. Their objective is to develop computational and communication tools for real-time, data-driven simulations of wildfires, with predictions to be made immediately available to firefighters in the field.

At the heart of the project is a coupled weather-wildfire model developed by Coen and colleagues at NCAR. Weather plays an obvious role in the spread or containment of fire: A sudden gust of wind can cause flames to jump a firebreak; a downpour can extinguish a blaze altogether. Less obvious, though well known to firefighters, is the role fire plays in creating its own local weather patterns ("local" being a relative term when thousands of acres are burning). Updrafts of superheated air shift winds and carry sun-screening particulate matter high into the sky. The meteorological part of the model uses terrain-following coordinates to track airflow over mountains and interactive nested grids to zoom in on details of vortices in the fireline.
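
For readers wondering what "terrain-following" means in practice, one standard choice (illustrative here, not necessarily the exact formulation in the NCAR code) replaces the height z with a coordinate that hugs the ground,

    \sigma = \frac{z - h(x,y)}{z_{\mathrm{top}} - h(x,y)},

where h(x,y) is the terrain elevation and z_top the model top. Grid surfaces then follow the mountains near the ground (sigma = 0) and flatten out aloft (sigma = 1), so the model can resolve airflow over rugged topography without grid cells cutting into the terrain.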

At present, the wildfire part of the NCAR model is empirical, using formulas for the rates at which different fuels burn (grasses burn quickly, logs slowly) and the rate of fire spread as a function of wind speed and direction. Each atmospheric grid cell at ground level is divided into fuel cells; the cells are assigned fuel characteristics from a list known as the Anderson fuel classes, which range from grass to slash (the residue of felled trees). Modifications planned by the DDSDM team will result in a model based on stochastic partial differential equations for fire physics. The basic variables are temperature T and fuel supply S (really a vector of various fuels) as functions of position and time. The equation for temperature change incorporates a random sum of Dirac delta functions representing heat transported by flying embers.
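
In schematic form, such a model might be written as a reaction-diffusion system (a sketch of the general type of equations described, with placeholder coefficients, not the team's actual formulation):

    \frac{\partial T}{\partial t} = \nabla\cdot(k\nabla T) - \mathbf{v}\cdot\nabla T
        + A\,r(T)\,S - B\,(T - T_a)
        + \sum_j c_j\,\delta(\mathbf{x}-\mathbf{x}_j)\,\delta(t - t_j),
    \qquad
    \frac{\partial S}{\partial t} = -C\,r(T)\,S,

where r(T) is a temperature-dependent burn rate, v the wind velocity supplied by the weather model, T_a the ambient temperature, and the final term the random sum of Dirac delta functions: embers igniting spot fires at random locations x_j and times t_j.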

To be useful for more than simulations, the weather-wildfire model needs access to data of various types, including maps, fuel surveys, and weather and fire information. Maps and fuel surveys are available on the Internet in various geographic information systems files, but these files will have to be translated into data that can be used in the model. Similarly, weather data can be culled online from various sites, but the information will need to be converted into a format consistent with the model.
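
The translation step is conceptually simple, as the following sketch suggests; the Anderson class numbers, parameter values, and array below are made-up placeholders, not the project's actual fuel tables or file formats:

    import numpy as np

    # Hypothetical burn-rate parameters keyed by Anderson fuel-class number.
    # The numbers are illustrative placeholders only.
    FUEL_PARAMS = {
        1:  0.9,   # short grass: burns quickly
        5:  0.4,   # brush
        11: 0.1,   # light slash: burns slowly
    }

    # A toy fuel-class raster, standing in for a map pulled from a GIS file.
    fuel_class_grid = np.array([[1, 1, 5],
                                [1, 5, 5],
                                [11, 11, 5]])

    # Translate each ground-level fuel cell into the parameter the model uses.
    burn_rate_grid = np.vectorize(FUEL_PARAMS.get)(fuel_class_grid)
    print(burn_rate_grid)

In practice, of course, much of the work lies in coping with the variety of file formats, projections, and coordinate systems in which such data are distributed.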

Detailed information about fires themselves can be obtained in two ways: thermal and infrared image data from airborne sensors, and field data from a network of inexpensive autonomous fire detectors, which could be dropped from planes or positioned by firefighters on the ground. Kremens and Vodacek are leading a team at RIT that has been developing rapidly deployable, GPS-equipped ground equipment from commercial components. Their prototype has sensors for smoke and carbon monoxide, but detectors could also be equipped to measure temperature, humidity, and wind speed; it all depends on how much money you have to burn (literally).

The idea is for all this data to be fed into a supercomputer at a remote location; simulations will then be run on the weather-wildfire model and results transmitted to people in the field, via PDAs or other wireless devices. That last step is, of course, the whole point. It raises its own questions as to what information should be sent and how it should be presented. For help in deciding where to allocate resources, for example, firefighters might want to see maps showing the likely spread of a fire over the next 12 hours. Getting that information out also involves issues of fault-tolerant and secure communication. Zhao is overseeing the effort to ensure that firefighters will be able to stay in touch while they are in the trenches.

Running a real-time simulation as real-time data is pouring in is itself a challenging problem. In dynamic data assimilation, "the model is running all the time and it incorporates new data as soon as it arrives," the researchers write in a summary of the project. "Uncertainty is dominant," they continue: "Important processes are not modeled, there are measurement and other errors in the data, the system is heavily nonlinear and ill-posed, and there are multiple possible outcomes." The solution, they believe, lies in sequential Bayesian filtering, in which the state of the system is a probability distribution p(x) approximated by an ensemble of time-state vectors x, which are snapshots at various times of what the computer model produces. (Multiple copies of the model, which already has stochastic elements, will be run in parallel.) The incoming data and their error bounds are understood as probability distributions, which, together with the relation between the time-states x and the observable quantities y, are used in Bayes's theorem to update the ensemble that represents the probability distribution p(x).
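
A minimal sketch of the idea in code (a generic ensemble/particle form of sequential Bayesian filtering, not the project's implementation; the model step, observation operator, and numbers are placeholders):

    import numpy as np

    rng = np.random.default_rng(0)

    def advance(x):
        # Placeholder stochastic model step: drift plus model noise, standing
        # in for one time step of the (stochastic) weather-wildfire model.
        return x + 0.1 + 0.05 * rng.standard_normal(x.shape)

    def observe(x):
        # Placeholder observation operator relating a time-state x to the
        # observable quantity y (here, the state is observed directly).
        return x

    def bayes_update(ensemble, y, obs_std):
        # Weight each ensemble member by the likelihood p(y | x) of the new
        # datum under Gaussian observation error, then resample in proportion
        # to those weights; this is Bayes's theorem applied to the ensemble.
        residual = y - observe(ensemble)
        weights = np.exp(-0.5 * (residual / obs_std) ** 2)
        weights /= weights.sum()
        idx = rng.choice(len(ensemble), size=len(ensemble), p=weights)
        return ensemble[idx]

    # The "model" runs continuously, folding in data as it arrives.
    ensemble = rng.normal(20.0, 5.0, size=200)   # ensemble approximating p(x)
    for y in [21.0, 22.5, 24.0]:                 # synthetic incoming observations
        ensemble = advance(ensemble)             # forecast: advance every member
        ensemble = bayes_update(ensemble, y, obs_std=1.0)
        print(round(ensemble.mean(), 2), round(ensemble.std(), 2))

Resampling is only one way of carrying out the update, and a filter that must cope with heavily nonlinear, ill-posed dynamics and high-dimensional states is necessarily more sophisticated, but the forecast-then-update cycle is the same.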

The research plan calls for a proof-of-principle field demo in 2007. If all goes well, future firefighters could have a powerful new tool for dealing with wildfire. A steady stream of data could prove to be the most effective way to put out unwanted fires.

Simulation of the Big Elk Fire near Pinewood Springs, Colorado, July 17, 2002, from the coupled atmosphere-fire model of the National Center for Atmospheric Research. Gray and white show hot air and smoke created by the fire. The arrows show the direction and speed of winds near the surface, including those created by the fire. From Tim Scheitlin, NCAR.