The National Interagency Fire Center (NIFC) has been the keeper of U.S. wildfire data for decades, tracking both the number of wildfires and acreage burned all the way back to 1926.
After making the entire dataset publicly available for decades, NIFC, in a blatant act of cherry-picking, has “disappeared” a portion of it and now shows only data from 1983 onward. You can see it here.
Fortunately, the Internet never forgets, and the entire dataset is preserved on the Internet Archive’s Wayback Machine.
Data prior to 1983 shows that U.S. wildfires were far worse 100 years ago, both in frequency and total acreage burned, than they are now, 100 years of modest warming later.
By disappearing all data prior to 1983, which just happens to be the lowest point in the dataset for the number of fires, NIFC data now show a positive slope of worsening wildfire that aligns with increased global temperature.
This truncated dataset is perfect for claiming “climate change is making wildfire worse,” but it is flawed because it lacks the context of the full dataset.
See figure 1 below for a before-and-after comparison of what the NIFC data look like when plotted.
The full data set shows wildfires were far worse in the past.
In June 2011, when this data was first made publicly available by the NIFC, the agency said,
“Figures prior to 1983 may be revised as NICC verifies historical data.”
In December 2017, I published an article titled “Is climate change REALLY the culprit causing California’s wildfires?”, pointing out that the federal government’s own data showed wildfires had declined significantly since the early 1900s, undermining claims being made by the media that climate change was making wildfires more frequent and severe.
Curiously, between January 14 and March 7 of 2018, shortly after this article appeared, NIFC added a new caveat to its data page stating:
The National Interagency Coordination Center at NIFC compiles annual wildland fire statistics for federal and state agencies. This information is provided through Situation Reports, which have been in use for several decades.
Prior to 1983, sources of these figures are not known, or cannot be confirmed, and were not derived from the current situation reporting process. As a result, the figures prior to 1983 should not be compared to later data.
With the Biden administration now in control of NIFC, the agency says,
“Prior to 1983, the federal wildland fire agencies did not track official wildfire data using current reporting processes. As a result, there is no official data prior to 1983 posted on this site.”
This attempt to rewrite the official United States fire history for political reasons is both wrong and unscientific. NIFC never previously expressed concern that its historical data might be invalid or shouldn’t be used.
NIFC’s data has been relied upon by peer-reviewed research papers and news outlets in the United States for decades. Without this data, there is no way to gauge the severity of past wildfires or to compare the number of wildfires in the past with the numbers today.
Wildfire data is fairly simple to record and compile: a count of the number of fires and the number of acres burned.
NIFC’s revision of wildfire history essentially labels every firefighter, every fire captain, every forester, and every smokejumper who has fought wildfires over the decades as untrustworthy in their assessment and measurement of this critical, yet very simple, fire data.
NIFC’s reason for erasing wildfire data before 1983 is not transparent at all. The agency cites no study and provides no scientifically sound methodological reason to distrust the historic data it previously publicized and referenced.
Indeed, NIFC provides no rationale at all for removing the historic data, nor any justification for a claim that it was flawed or incorrect.
If the fact that scientists or bureaucrats have changed the way they track and calculate data over time legitimately justified throwing out or dismissing every bit of evidence gathered before contemporary processes were adopted, there would be no justification for citing past data on temperatures, floods, droughts, hurricanes, demographics, or economics.
The way all of these and other “official” records have been recorded has changed dramatically over time.
Even the way basic temperature is recorded is constantly evolving, from changes in where temperatures are measured to how they are measured: from a few land-based stations and ocean-going ship readings to weather balloons, satellites, and ocean buoys.
If simply changing the way a class of data is recorded justifies jettisoning all historic datasets, then no one can say with certainty that temperatures have changed over time or that a human fingerprint of warming has been detected.
Plotting the entire NIFC dataset (before it was partially disappeared) demonstrates that wildfire and weather patterns have been inextricably linked for decades.
Note figure 2 below, which combines the number of fires and the number of acres burned. See the annotations I have added.
NIFC’s decision to declare data prior to 1983 “unreliable” and remove it not only hides important fire history but also cherry-picks a starting point that is the lowest in the entire record, ensuring that an upward trend exists from that point.
The definition of cherry-picking is:
Cherry-picking, suppressing evidence, or the fallacy of incomplete evidence is the act of pointing to individual cases or data that seem to confirm a particular position while ignoring a significant portion of related and similar cases or data that may contradict that position.
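To make the effect of that starting point concrete, here is a minimal sketch using synthetic annual fire counts (hypothetical numbers, not the actual NIFC figures): a series that declines for decades, bottoms out around 1983, and then partially rebounds shows a negative trend over the full record, but a positive trend if the fit begins at the minimum.

```python
# Minimal sketch with SYNTHETIC data (not actual NIFC figures) showing how
# starting a trend fit at a series' minimum can flip the sign of the slope.
import numpy as np

years = np.arange(1926, 2021)
counts = np.concatenate([
    np.linspace(200_000, 20_000, 57),   # 1926-1982: long decline (synthetic)
    np.linspace(20_000, 70_000, 38),    # 1983-2020: partial rebound (synthetic)
])

def trend_per_year(x, y):
    """Slope of an ordinary least-squares line fit, in counts per year."""
    return np.polyfit(x, y, 1)[0]

full_slope = trend_per_year(years, counts)          # fit over 1926-2020
mask = years >= 1983
truncated_slope = trend_per_year(years[mask], counts[mask])  # fit over 1983-2020

print(f"Full-record slope:   {full_slope:+.0f} fires/year")      # negative
print(f"1983-onward slope:   {truncated_slope:+.0f} fires/year")  # positive
```

The underlying numbers are unchanged in both fits; only the chosen starting point differs, which is exactly what cherry-picking a baseline does.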
It seems NIFC has caved to political pressure to disappear inconvenient wildfire data. This action is unscientific, dishonest, and possibly fraudulent. NIFC is no longer trustworthy as a source of reliable information on wildfires.
Read more at Climate Realism
The fact is that back in the 1970s, the story was global cooling and a coming new Ice Age. There is an episode of the TV series In Search Of from 1978 all about the coming Ice Age, including the hard winters of 1976/77 and Buffalo, N.Y. getting hit with record snowfalls.
Cherry-picking data is a standard mode of operation of the climate change movement. Consider ocean “acidification.” The year 1988 was chosen as the baseline because the pH was at a high point. This guaranteed that, by comparison, other years would have a lower pH, more toward the acid side. Then the fraudsters replaced the data before 1988 with simulated data. They had to, because the real data showed there was no long-term trend of the ocean’s pH going down. Oftentimes hiding data isn’t enough. The official temperature data before 1950 was changed to be colder than it really was, and the data after 1950 was made warmer in order to support the claims of global warming. The winter of 2017 was extra cold, so the official records were changed to make it look like an ordinary year.
These acts of fraud represent some of the most compelling evidence against the global warming/climate change narrative. The people changing the data obviously know that the unaltered data doesn’t support the political movement.
Let’s start putting the eco-freaks on these fires. I mean, if those nitwits were so concerned about this global warming/climate change scam, they should think (if they still can) about all the greenhouse gases produced by forest wildfires.
Try putting a bushfire out without fossil fuels…
Ditto. Glad to see more detailed exposure of government climate data fraud, manipulation, and blatant hypocrisy (to name just a few). My short video catches Joe red-handed in data fraud. Geez, Joe can’t even intelligently lie about the climate … https://newtube.app/user/RAOB/KX3Jgsm