Two European professors recently wrote that the IPCC projections of future warming are based on huge unknowns, and do not take the past properly into account.
This means that projections of the future of the world’s climate are unreliable, according to Samuel Furfari, Professor at the Free University of Brussels,
and Henri Masson, Professor (Emeritus), University of Antwerpen.
According to the two professors, the GISS Surface Temperature Analysis (GISTEMP v4) is an estimate of global surface temperature change (one that is often used by climate scientists for their reports to the media).
This estimate is computed using data files from NOAA GHCN v4 (meteorological stations), and ERSST v5 (ocean areas).
In June 2019, the number of terrestrial stations was 8,781 in the GHCNv4 unadjusted dataset; but in June 1880, that figure was a mere 281 stations, the two professors write.
Scientists ignoring climate’s cyclic features
Professors Furfari and Masson write: “The climate system, and the way IPCC represents it, is highly sensitive to tiny changes in the value of parameters or initial conditions and these must be known with high accuracy. But this is not the case.”
In other words, the IPCC method is fraught with great uncertainties and much guesswork.
“This puts serious doubt on whatever conclusion that could be drawn from model projections,” the two professors write.
Open door to “fake conclusions”… “manipulations”
Masson and Furfari say that IPCC scientists ignore the cyclic behavior of climate change, and that linear trend lines applied to (poly-)cyclic data, whose period is similar to the length of the time window considered, open the door to all kinds of fake conclusions, if not to manipulations aimed at pushing one political agenda or another.
Sea surface temperature incomplete
Other factors also cast doubt on the reliability of climate models, among them the lack of data on sea surface temperatures (SSTs). The oceans represent about 70% of the Earth's surface.
“Until very recently, these temperatures have been only scarcely reported, as the data for SST (Sea Surface Temperature) came from vessels following a limited number of commercial routes,” report Masson and Furfari.
Such gaping data holes make it impossible to calibrate the models correctly so that they can project the future climate. In many cases, scientists are simply free to guess whatever they want.
Masson’s and Furfari’s conclusions follow:
- IPCC projections result from mathematical models which need to be calibrated by making use of data from the past. The accuracy of the calibration data is of paramount importance, as the climate system is highly non-linear, and this is also the case for the (Navier-Stokes) equations and (Runge-Kutta integration) algorithms used in the IPCC computer models. Consequently, the system, and also the way IPCC represents it, is highly sensitive to tiny changes in the value of parameters or initial conditions (the calibration data in the present case), which must be known with high accuracy. This is not the case, putting serious doubt on any conclusion that could be drawn from model projections.
- Most of the mainstream climate-related data used by IPCC are indeed generated from meteo data collected at land meteo stations. This has two consequences:
(i) The spatial coverage of the data is highly questionable, as the temperature over the oceans, representing 70% of the Earth's surface, is mostly neglected or "guesstimated" by interpolation;
(ii) The number and location of these land meteo stations have considerably changed over time, inducing biases and fake trends.
- The key indicator used by IPCC is the global temperature anomaly, obtained by spatially averaging, as well as possible, local anomalies. A local anomaly is the difference between the present local temperature and the average local temperature calculated over a previously fixed 30-year reference period, changing every 30 years (1930-1960, 1960-1990, etc.). The concept of a local anomaly is highly questionable, due to the presence of poly-cyclic components in the temperature data, which induce considerable biases and false trends when the "measurement window" is shorter than at least 6 times the longest period detectable in the data; this is, unfortunately, the case with temperature data.
- Linear trend lines applied to (poly-)cyclic data of period similar to the length of the time window considered, open the door to any kind of fake conclusions, if not manipulations aimed to push one political agenda or another.
- Consequently, it is highly recommended to abandon the concept of global temperature anomaly and to focus on unbiased local meteo data to detect an eventual change in the local climate, which is a physically meaningful concept, and which is after all what is really of importance for local people, agriculture, industry, services, business, health and welfare in general.
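The warning above about linear trends fitted to (poly-)cyclic data can be checked with a small numerical sketch (illustrative numbers only, not the professors' own calculation; the 60-year cycle length is a hypothetical choice): fitting a straight line to a pure, trend-free sine cycle yields a large apparent slope when the observation window is much shorter than the cycle's period, and a near-zero slope once the window spans many periods.

```python
import numpy as np

PERIOD = 60.0  # hypothetical 60-year cycle, purely illustrative


def fitted_slope(years):
    """Least-squares linear trend fitted to a trend-free sine cycle."""
    temps = np.sin(2 * np.pi * years / PERIOD)  # no underlying trend at all
    slope, _intercept = np.polyfit(years, temps, 1)
    return slope


# Window covering only the rising quarter of the cycle: a steep "trend" appears.
slope_short = fitted_slope(np.arange(0.0, 15.0))

# Window covering 6 full cycles: the spurious trend all but vanishes.
slope_long = fitted_slope(np.arange(0.0, 360.0))

print(f"60-year cycle, 15-year window:  slope = {slope_short:+.4f} per year")
print(f"60-year cycle, 360-year window: slope = {slope_long:+.4f} per year")
```

The short window reports a strong warming-like slope even though the series has no trend by construction, while the long window (six full periods, matching the 6x rule of thumb quoted above) recovers a slope close to zero.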
=================================
Dr. Samuel Furfari is a recognized authority on energy policy based in Brussels. From 1982 to 2018, he was a senior official on energy policy in the European Commission. For 15 years he has been a professor of energy geopolitics and energy politics at the Free University of Brussels. He is the president of the European Society of Engineers and Industrialists.
Read more at No Tricks Zone
Climate models are only as honest as those who design them, and when the designers get government grants they will make up and fake the models; as long as the money pours in, they will continue to produce fake models.