*This piece is the first of a multi-part series by Brian Isom on wildfires.*
Last year, as devastating wildfires burned across much of the Western United States, arguments raged over their root cause. President Trump lambasted California via Twitter, threatening to cut off relief aid for fire victims if the state did not get its act together, citing “gross mismanagement of the forest” as the reason for such devastating fires. This prompted other politicians, including California lawmakers, to counter that the wildfires burning in the West were not a management problem. Climate change, they argued, was to blame. Senator Bernie Sanders (I-VT) stated that “record-breaking wildfires in California have everything to do with climate change.”
Of course, this arguing did little to help the residents of California, who were still reeling from the most destructive wildfire in state history. The Camp Fire, the subject of President Trump’s tweets, destroyed 14,000 homes and claimed the lives of 86 people. It was the unfortunate end to a record-breaking wildfire season that cost the federal government over $3 billion in suppression spending alone.
Considering the toll this issue has taken on the country recently, it is understandable that political debates over how to deal with these wildfires have become so heated. The current debate, however, misses the point. In reality, both sides are right. Record-high temperatures in the West are creating drier conditions and extending the wildfire season. At the same time, historically bad forest management, rooted in a long misunderstanding of the role wildfires play in ecological systems, has combined with strict regulations and growing development to turn many western landscapes into dangerously fire-prone areas.
While we’ve heard for decades that “Only You Can Prevent Wildfires,” the longest-running public service announcement in American history fails to address the reality we now face. Climate change is a long-term problem that will take significant time and resources to address. In the meantime, enabling more effective forest management practices is the most practical step toward giving fire managers better tools to prepare for and respond to changing wildfire challenges.
The Changing Landscape of Wildfire Management
Many of the wildfire issues we are facing today stem from decisions made over a century ago. Fire policy in the first half of the 20th century was focused entirely on all-out fire suppression. This approach was a response to the Great Fire of 1910, which burned over 3 million acres across Montana and Idaho, a land area roughly 20 times larger than last year’s Camp Fire. Over 9,000 personnel were dispatched to fight the blaze, and ultimately 85 people lost their lives. As more and more Americans traveled west, fear of another uncontrollable outbreak like the Great Fire led wildfire managers to adopt a rule known as the 10 a.m. policy. The policy was an ultimatum, mandating that every wildfire be put out by 10 a.m. the day after it was discovered.
The 10 a.m. policy was America’s first truly anti-wildfire policy, setting a precedent for all-out suppression efforts that persisted for nearly 70 years. The policy, however, failed to recognize (or perhaps chose not to acknowledge) the ecological benefits wildfires can provide for forests and grasslands. That may seem counterintuitive, but for many ecosystems in the United States, particularly in the West, fire has historically been a natural part of ecological development. Fire acts as a cleansing agent, burning away dead or diseased plants, which in turn enriches the soil and spurs new plant growth. Eliminating fire from the landscape eliminated those benefits, leading to unhealthy, overgrown forests and a buildup of dead organic material on forest floors.
These photos of Yosemite Valley provide a great example of what happens to a landscape when it is deprived of fire.
On the left is the valley in 1899, before fire management had made any real impact. On the right is the valley in 2011, the result of 100 years of fire suppression. Notice how densely packed with trees Yosemite Valley is now compared to 1899. All of that overgrowth provides more fuel for fires once they begin to burn, which makes them much more difficult to contain.
It wasn’t until the 1970s that federal agencies began to acknowledge the ecological benefits of wildfires. Development in the West had grown considerably by that point, as had the financial strain of suppression efforts.
Shortly before the Great Fire of 1910, Congress had passed the Forest Fires Emergency Act. The act stipulated:
[A]dvances of money under any appropriation for the Forest Service may be made to the Forest Service and by authority of the Secretary of Agriculture to chiefs of field parties for fighting forest fires in emergency cases…
Congress had essentially handed the Forest Service a blank check to cover its fire expenses. Over the next 60 years, fire suppression became so expensive that Congress and the Office of Management and Budget directed federal wildfire agencies to focus on improving efficiency. This resulted in the end of the 10 a.m. policy in 1972 and the repeal of the Forest Fires Emergency Act in 1978.
This push for efficiency, coupled with new perceptions of the ecological value of wildfires, caused fire managers to shift from costly all-out suppression efforts to less expensive prevention measures like forest thinning and controlled burns. They stopped putting out every fire that sprang up and instead opted for a policy of “appropriate suppression response,” allowing some fires to burn naturally if they posed no harm to anyone. This shift in management practices actually led to a decrease in suppression costs during the 1980s.
History, unfortunately, repeated itself in 1988. Small fires burning in Yellowstone National Park quickly grew out of hand, coalescing into a major conflagration. The fire, which was allowed to burn naturally at first under the new management practices, spread more rapidly than expected due to shifting winds and drought conditions. The final bill came to $120 million. This event, like the Great Fire of 1910, shifted the focus back toward suppressive, rather than preventative, management. That reversal is one of the key factors that have driven suppression costs to record-breaking highs year after year ever since.
Modern Impacts of Bad Policy
One of the easiest ways to see the effect of these growing costs is to look at the US Forest Service, the main agency in charge of fighting wildfires. The agency’s mission is “to sustain the health, diversity, and productivity of the Nation’s forests and grasslands to meet the needs of present and future generations.”
To meet that goal, the USFS must oversee a number of operations, such as managing trails, protecting watersheds, and encouraging conservation and economic development on public lands. More and more of its budget, however, is shifting toward fire-related costs. In 1995, 15 percent of the USFS budget went toward fire spending. By 2015, that share had grown to 52 percent, and by 2025 the agency is expected to dedicate 67 percent of its budget to wildfire management. In a 2015 report, the Forest Service acknowledged what effect this would have on its ability to meet other obligations:
[In] just 10 years, two out of every three dollars the Forest Service gets from Congress as part of its appropriated budget will be spent on fire programs. As more and more of the agency’s resources are spent each year to provide the firefighters, aircraft, and other assets necessary to protect lives, property, and natural resources from catastrophic wildfires, fewer and fewer funds and resources are available to support other agency work — including the very programs and restoration projects that reduce the fire threat…
Data from the National Interagency Fire Center provides some insight into the troubling trends in wildfire management. The following table shows the average frequency, size, and cost of wildfires over each of the past three decades:
Despite the frequency of fires actually falling over the past 30 years, the average size of those fires is increasing, as are the costs of fighting them. Over that period, the average yearly total of acres burned doubled, while average annual suppression costs more than quadrupled.
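To make that comparison concrete, here is a minimal sketch in Python of how decade averages and growth multiples like these could be computed from annual totals. The numbers in it are hypothetical placeholders chosen only to illustrate the calculation; they are not actual National Interagency Fire Center figures.

```python
# Minimal sketch: computing decade averages of wildfire statistics.
# All annual figures below are HYPOTHETICAL placeholders, not NIFC data.
from statistics import mean

# Hypothetical annual records: (year, fires, acres_burned, suppression_cost_usd)
records = [
    (year,
     80_000 - 700 * (year - 1989),          # fire counts trending down (placeholder)
     3_000_000 + 180_000 * (year - 1989),   # acres burned trending up (placeholder)
     4.0e8 * 1.05 ** (year - 1989))         # costs compounding upward (placeholder)
    for year in range(1989, 2019)
]

def decade_average(rows, start):
    """Average each annual statistic over the decade beginning at `start`."""
    decade = [r[1:] for r in rows if start <= r[0] < start + 10]
    return [mean(col) for col in zip(*decade)]

first = decade_average(records, 1989)   # averages for 1989-1998
last = decade_average(records, 2009)    # averages for 2009-2018
for label, a, b in zip(("fires", "acres burned", "suppression costs"), first, last):
    print(f"{label}: {b / a:.2f}x the first decade's annual average")
```

Swapping in the published NIFC annual totals should reproduce the trends described above: a roughly twofold change in acres burned and a more-than-fourfold change in suppression costs between the first decade and the last.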
So what is causing this rise in spending?
That is the million-dollar (or should I say $3 billion) question. In my next piece for The Benchmark next month, I’ll explore some of the issues contributing to these ballooning costs, including how environmental regulation is actually hurting prevention efforts and how increased urban development in traditionally fire-prone landscapes is driving up suppression costs. In each of these instances, climate change may be exacerbating the scope and severity of wildfires, but better approaches to forest management will be the key to curbing the rising cost of wildfire disasters.