More on the Bombshell David Rose Article: Instability in the Global Historical Climatology Network

(h/t WUWT?) Yesterday there was an article in the Mail by David Rose, regarding manipulation and adjustment of temperature data. This issue comes up fairly regularly, but what’s new is that the source of the information this time is a “whistleblower”, John Bates, a highly regarded climate scientist who actually worked at NOAA until retiring last year.

Bates has a substantial and technical article at Climate etc.

The purpose of this post is to confirm one detail of Bates’s complaint. The Mail article says:

“The land temperature dataset used by the study was afflicted by devastating bugs in its software that rendered its findings ‘unstable’.”

And later in the article:

“Moreover, the GHCN software was afflicted by serious bugs. They caused it to become so ‘unstable’ that every time the raw temperature readings were run through the computer, it gave different results.”

Bates is quite correct about this. I first noticed the instability of the GHCN (Global Historical Climatology Network) adjustment algorithm in 2012. Paul Homewood at his blog has been querying the adjustments for many years, particularly in Iceland, see here, here, here and here for example. Often, these adjustments cool the past to make warming appear greater than it is in the raw data.

When looking at the adjustments made for Alice Springs in Australia, I noticed (see my comment in this post in 2012) that the adjustments made to past temperatures changed, often quite dramatically, every few weeks. I think Paul Homewood also commented on this himself somewhere at his blog.

When we first observed these changes, we thought that perhaps the algorithm itself had been changed. But it became clear that the adjustments were changing so often that this couldn’t be the case; it was the algorithm itself that was unstable.

In other words, when new data was added to the system every week or so and the algorithm was re-run, the resulting past temperatures came out quite differently each time.

Here is a graph that I produced at the time, using data that can be downloaded from the GHCN ftp site (the unadjusted and adjusted files are ghcnm.tavg.latest.qcu.tar.gz and ghcnm.tavg.latest.qca.tar.gz respectively) illustrating the instability of the adjustment algorithm:
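As a rough sketch of how such a comparison can be produced from those files, the following Python fragment parses the fixed-width GHCN-M v3 .dat files (extracted from the archives named above) and plots the annual mean TAVG for one station, raw against adjusted. This is not the code used for the graphs in this post: the station ID, local file names and column layout are stated here as assumptions and should be checked against the GHCN-M documentation and the accompanying inventory (.inv) file.

```python
# Minimal sketch: parse a GHCN-M v3 .dat file and plot annual mean TAVG for
# one station, raw vs adjusted. Assumes the v3 fixed-width layout: 11-char
# station ID, 4-char year, 4-char element, then 12 monthly fields of a
# 5-char value plus 3 flag chars, with values in hundredths of a degree C.
import matplotlib.pyplot as plt

MISSING = -9999

def read_station(path, station_id, element="TAVG"):
    """Return {year: annual mean in deg C} for one station, complete years only."""
    annual = {}
    with open(path) as f:
        for line in f:
            if not line.startswith(station_id) or line[15:19] != element:
                continue
            year = int(line[11:15])
            monthly = [int(line[19 + 8 * m : 24 + 8 * m]) for m in range(12)]
            if MISSING in monthly:
                continue  # skip years with any missing month
            annual[year] = sum(monthly) / 12.0 / 100.0  # hundredths of deg C -> deg C
    return annual

STATION = "50194326000"  # hypothetical ID for Alice Springs; confirm against the .inv file

raw = read_station("ghcnm.tavg.latest.qcu.dat", STATION)  # unadjusted (placeholder file name)
adj = read_station("ghcnm.tavg.latest.qca.dat", STATION)  # adjusted (placeholder file name)

for series, label in [(raw, "raw (qcu)"), (adj, "adjusted (qca)")]:
    years = sorted(series)
    plt.plot(years, [series[y] for y in years], label=label)
plt.ylabel("Annual mean TAVG (°C)")
plt.legend()
plt.show()
```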

The dark blue line shows the raw, unadjusted temperature record for Alice Springs. The green line shows the adjusted data as reported by GHCN in January 2012. You can see that those adjustments are quite small. The red line shows the adjusted temperature after being put through the GHCN algorithm, as reported by GHCN in March 2012.

In this case, past temperatures have been cooled by about 2 degrees. In May, the adjustment algorithm actually warmed the past, leading to adjusted past temperatures about 3 degrees warmer than those reported in March!

Note that all the lines converge at the right-hand end, since the adjustment algorithm starts from the present and works backwards. The divergence of the lines as they go back in time illustrates the instability.
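To put a number on this kind of jump, one can difference two adjusted snapshots of the same station downloaded a few weeks apart. A hedged sketch, reusing read_station() and STATION from the fragment above; the snapshot file names here are placeholders for locally saved copies, not actual GHCN file names:

```python
# Sketch: how far did the "adjusted" past move between two downloads of the qca file?
jan = read_station("qca_2012-01.dat", STATION)   # adjusted data saved in January
mar = read_station("qca_2012-03.dat", STATION)   # adjusted data saved in March

common = sorted(set(jan) & set(mar))
shift = {y: mar[y] - jan[y] for y in common}

worst = max(shift, key=lambda y: abs(shift[y]))
print(f"Largest change to an already-adjusted year: {shift[worst]:+.2f} °C in {worst}")
```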

There is a blog post by Peter O’Neill, “Wanderings of a Marseille January 1978 temperature, according to GHCN-M”, showing the same instability of the algorithm. He looks at adjusted temperatures for Marseille, which show the same apparently random jumping around, although the amplitude of the instability is somewhat lower than in the Alice Springs case shown here.

His post also shows that more recent versions of the GHCN code have not resolved the problem, as his graphs go up to 2016. You can find several similar posts at his blog.

There is a lot more to be said about the temperature adjustments, but I’ll keep this post focused on this one point: the GHCN adjustment algorithm is unstable, as stated by Bates, and produces virtually meaningless adjusted past temperature data. The graphs shown here and by Peter O’Neill demonstrate this.

No serious scientist should make use of such an unstable algorithm. Note that this spurious adjusted data is then used as the input for widely reported data sets such as GISTEMP. Even more absurdly, GISS carry out a further adjustment themselves on the already adjusted GHCN data. It is inconceivable that the climate scientists at GHCN and GISS are unaware of this serious problem with their methods.

Finally, I just downloaded the latest raw and adjusted temperature datasets from GHCN as of Feb 5 2017. Here are the plots for Alice Springs. There are no prizes for guessing which is raw and which is adjusted.

You can see a very similar graph at GISS.

Source

Comments (1)

    aido

Written by John O’Sullivan, PSI, on November 2, 2016.
    Breaking: 1920’s Brit ‘fatally infected’ All Government Climate Models
    A sensational new study shows western government climate models rely on a fatally flawed 1920s algorithm. Scientists say this could be the breakthrough that explains why modern computers are so awful at predicting climate change: simulations “violate several known Laws of Thermodynamics.”
    British climate researcher Derek Alker presents an extraordinary new paper ‘Greenhouse Effect Theory within the UN IPCC Computer Climate Models – Is It A Sound Basis?’ exposing previously undetected errors that government climate researchers have unknowingly fed into multi-million dollar climate computers since the 1940s. [1]
    Alker explains:
    “This paper examines what was originally calculated as the greenhouse effect theory by Lewis Fry Richardson, the brilliant English mathematician, physicist and meteorologist.
    In 1922 Richardson devised an innovative set of differential equations. His ingenious method is still used today in climate models. But unbeknown to Richardson, he had inadvertently relied upon unchecked (and fatally flawed) numbers supplied by another well-known British scientist, W. H. Dines.”
    Unfortunately for Richardson, Dines had wrongly treated terrestrial (ground) radiation, rather than the sun, as Earth’s only energy source. Richardson took the Dines numbers at face value and did not detect the error when combining them with his own. Alker continues: “The archives show Richardson never double-checked the Dines work (see below) and the records do not show that anyone else has ever exposed it.”
    The outcome, says Alker, is that not only was the original Richardson and Charney computer model corrupted, but so has been every other computer climate model since. All government researchers use these core numbers and believe them to be valid, even though what they seek to represent can today be shown to be physically impossible.

    Alker adds:
    “My paper specifically describes how the theory Dines calculated in his paper violates several of the known Laws of Thermodynamics, and therefore does not describe reality.
    The greenhouse effect theory we know of today is based on what Richardson had formulated from the Dines paper using unphysical numbers created by Dines. But Dines himself later suggested his numbers were probably unreliable.”
    Unfortunately, Dines died in the mid-1920s without informing Richardson of the error. Then, in the late 1940s, Richardson began working with another world figure in climate science, Jule Charney, and the duo constructed the world’s first computer climate model. It was then that the dodgy Dines numbers infected the works.
    Alker, who studied the archives scrupulously for his research, reports that there is no published evidence that Richardson understood Dines’s calculation method, and that he and Charney appear to have put the Dines numbers into the world’s first computer model verbatim.
    In essence, the ‘theory’ of greenhouse gas warming based on the Dines numbers can be shown to start with a misapplication of Planck’s Law, which generates grossly exaggerated ‘up’ and non-existent ‘down’ radiative emission figures. Then, layer by layer, part of the downward radiation is added to the layer below, in violation of the Second Law of Thermodynamics.
    Thereby, like a domino effect, this bogus calculation method becomes GIGO (“garbage in, garbage out”) for every computer that runs the program. Alker adds:
    “What the climate simulations are doing is creating energy layer by layer in the atmosphere that shouldn’t be there (it has no source other than itself). It is then destroyed layer by layer (it is absorbed and then discarded, in effect destroyed). This is all presented in such a way as to give the appearance that energy is being conserved, when it is not being conserved.”
