There are different methods for measuring global temperatures. The satellite record, as compiled by meteorologists from the University of Alabama in Huntsville (UAH), is used by the United Nations’ Intergovernmental Panel on Climate Change. It is the temperature time series most often quoted by sceptics of anthropogenic global warming.
Republican Senator Ted Cruz made much of the 18-year-long pause in this record during his cross-examination of Sierra Club President Aaron Mair at a US Senate subcommittee hearing late last year.
Mr Mair, like most climate justice activists, was unfamiliar with this evidence and had no idea that the record showed no warming since the super El Nino of 1997-1998. Rather than explain the trends in any of the global temperature datasets, Mr Mair could only say that global warming was real because there was a scientific consensus that it was real. Of course, a “consensus” is a form of politics, and while it can deny a fact, it can’t actually change one. Whether or not there is a trend in a series of numbers is determined by statistics, not by consensus or opinion.
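Indeed, anyone with a spreadsheet or a few lines of code can check for a trend themselves. Below is a minimal sketch in Python; the anomaly values are made-up placeholders, not the actual UAH numbers, which can be substituted in.

```python
import numpy as np

# Hypothetical monthly temperature anomalies (deg C); substitute the
# real UAH lower-troposphere series to test for a trend yourself.
anomalies = np.array([0.02, -0.05, 0.11, 0.04, -0.01, 0.08,
                      0.15, 0.03, -0.02, 0.09, 0.12, 0.06])
months = np.arange(len(anomalies))

# Ordinary least-squares fit: the slope is in deg C per month.
slope, intercept = np.polyfit(months, anomalies, 1)
decadal_trend = slope * 120  # 120 months per decade

print(f"Trend: {decadal_trend:+.3f} deg C per decade")
```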
Until this year, there was no warming trend in the UAH satellite data. Then the February 2016 update to the database showed a surge in global temperatures, particularly in the northern hemisphere. This has been attributed to an El Nino event, exactly the same phenomenon that caused the surge in temperatures in 1997-1998. So the pause has been broken, and the cause is not carbon dioxide.
As a colleague emailed me, “The extent of the observed increase in global temperatures is out of all proportion to the increase in carbon dioxide for the same period.” That’s correct. El Nino events are not caused by carbon dioxide. They are natural events that manifest as changes in ocean and atmospheric circulation patterns across the Pacific Ocean. An El Nino typically begins with a weakening of the trade winds and is registered as a fall in air pressure over Tahiti and a rise in surface pressure over northern Australia, as recorded at Darwin.
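This Tahiti-Darwin pressure difference is conventionally summarized as the Southern Oscillation Index (SOI). A minimal sketch of the widely used Troup formulation follows; the pressure values here are illustrative placeholders, and in practice the climatological mean and standard deviation are computed from the full historical record for the calendar month in question.

```python
def troup_soi(tahiti_mslp, darwin_mslp, diff_mean, diff_sd):
    """Troup Southern Oscillation Index for one month.

    tahiti_mslp, darwin_mslp: monthly mean sea-level pressures (hPa).
    diff_mean, diff_sd: long-term mean and standard deviation of the
    (Tahiti - Darwin) difference for that calendar month.
    Sustained values below about -7 indicate El Nino conditions.
    """
    diff = tahiti_mslp - darwin_mslp
    return 10.0 * (diff - diff_mean) / diff_sd

# Illustrative numbers only, not observed values:
print(troup_soi(1010.2, 1012.5, 0.9, 2.2))  # strongly negative, El Nino-like
```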
While the UAH satellite data only goes back to 1979, there is a record of changes in air pressure at Darwin back to 1876. This time series indicates that super El Ninos also occurred in 1914/1915 and 1940/1941. And the surface temperature records, as measured with mercury thermometers at some locations in eastern and northern Australia, show that 1914/1915 and 1940/1941 were hotter than the present.
So, while the global satellite temperature data indicates that February 2016 is the hottest month on record, this pertains to a record that only goes back to 1979. If we consider the much longer surface temperature record for many individual locations across Australia and other parts of the world, February 2016 is not that hot.
But this is an exceedingly contentious claim, rejected by a “consensus” of climate scientists who rely exclusively on homogenized temperature series. That is, the early temperature records are almost all adjusted down through the homogenization process, so the present appears warmer relative to the past. It is not contested that the official surface temperature records are adjusted, or that the early records are generally cooled. This is justified mostly on the basis of “world’s best practice” and the expectation that temperature series should show global warming from at least 1910. I’ve explained the inconsistencies in the adjustments made in the homogenization of the original observed maximum temperatures at Darwin in a technical paper originally accepted for publication in the International Journal of Climatology (see postscript for more information).
Back in 1876, the weather station at Darwin was the responsibility of Charles Todd, South Australia’s Postmaster General and Superintendent of Telegraphs. His priority until 1872 had been the construction of an overland telegraph line from Adelaide to Darwin, along which he established 14 regional weather stations. Todd’s first passions were meteorology and astronomy, both of which he used in his weather forecasting.
Air pressure measurements were important to the weather forecasts Todd issued, and these barometer readings were standardized using local temperature measurements. It was thus important that local temperatures were accurately recorded. While these records stood for over 100 years, beginning in 1996 the Australian Bureau of Meteorology started “adjusting” all the old temperature records used in the calculation of official temperature trends, including the maximum temperature series for Darwin.
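The reason local temperature mattered is that a mercury barometer reads high on a warm day: the mercury column expands more than the brass scale against which it is read. Below is a minimal sketch of the standard reduction to 0°C, using nominal textbook expansion coefficients; treat the exact constants and numbers as illustrative.

```python
# Reduce a mercury-barometer reading to 0 deg C.
# beta: volume expansion of mercury; sigma: linear expansion of a brass scale.
BETA_MERCURY = 1.818e-4  # per deg C (nominal textbook value)
SIGMA_BRASS = 1.84e-5    # per deg C (nominal textbook value)

def reduce_to_zero_c(reading_hpa, temp_c):
    """Temperature-correct an observed barometer reading (hPa)."""
    return reading_hpa * (1 + SIGMA_BRASS * temp_c) / (1 + BETA_MERCURY * temp_c)

# On a 35 deg C Darwin afternoon the raw reading overstates pressure:
print(reduce_to_zero_c(1010.0, 35.0))  # roughly 1004.3 hPa
```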
What the homogenization process tends to do is not only create a global warming trend where none previously existed, but also remove the natural cooling and warming cycles so evident in the raw observational data. For example, in the unadjusted maximum temperatures recorded at Darwin, the hottest year is 1907. Temperatures then cooled until 1942, when there was a spike. Note that 1941/1942, like 1997/98 and 2015/2016, was an El Nino period. These were also years of minimum lunar declination.
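The arithmetic of this effect is easy to demonstrate. The sketch below uses entirely synthetic numbers, not the Bureau’s actual Darwin adjustments: a series with no long-term trend, only a natural cycle, acquires a warming trend once the early decades are stepped down.

```python
import numpy as np

years = np.arange(1900, 2000)

# Synthetic raw record: no long-term trend, just a natural cycle plus noise.
rng = np.random.default_rng(0)
raw = (32.0 + 0.4 * np.sin(2 * np.pi * (years - 1900) / 60)
       + rng.normal(0, 0.2, years.size))

# A step adjustment of the kind homogenization can introduce:
# cool everything before a chosen breakpoint by 0.6 deg C.
adjusted = raw.copy()
adjusted[years < 1950] -= 0.6

for name, series in (("raw", raw), ("adjusted", adjusted)):
    slope = np.polyfit(years, series, 1)[0] * 100  # deg C per century
    print(f"{name:>8}: {slope:+.2f} deg C per century")
```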
Unlike modern meteorologists, Todd understood that climate change on Earth is driven by extraterrestrial phenomena. But he would likely have cautioned against single-cause explanations, recognizing that there are multiple, overlapping periodicities evident in the history of the Earth’s climate. There are natural cycles that span tens of thousands of years, affected by changes in the Earth’s tilt, and much shorter cycles affected by changes in solar activity. Early 20th century astronomers and weather forecasters, particularly Inigo Owen Jones, were interested in the planets. They noted decades in advance that 1940/41 would be a year of conjunction of Jupiter, Saturn and Uranus.
Todd would have outlawed the practice of homogenization. Scientists of that era considered the integrity of observational records sacrosanct.
In an online thread I recently read the following comment about homogenization:
Don’t you love the word homogenise? When I was working in the dairy industry we used to have a homogeniser. This was a device for forcing the fat back into the milk. What it did was use a very high pressure to compress and punish the fat until it became part of the milk. No fat was allowed to remain on the top of the milk; it all had to be the same consistency… Force the data under pressure to conform to what is required. Torture the data if necessary until it complies…
Clearly the Bureau’s remodeling of historical temperature data is unnatural and unscientific. In erasing the natural climate cycles and generating a global warming trend, it greatly diminishes the capacity of modern climate scientists to forecast spikes in temperatures, and also their capacity to forecast droughts and floods.
Because of the homogenization of the surface temperature record in the compilation of national and global climate statistics, those sceptical of anthropogenic global warming have long preferred the UAH satellite record, even though it only begins in 1979.
The UAH global temperature record for the lower troposphere, which once showed no trend for 18 years, now shows a surge in warming. This warming, however, is neither catastrophic nor outside the bounds of natural variability. And it certainly hasn’t been caused by carbon dioxide.