There are at least seven major errors in the climate models. Five of them come from oversimplified assumptions about energy transfer. A sixth involves the numerical errors that build up when solving the large sets of coupled equations that describe how air and water behave.
The seventh is the assumption that a global average temperature is a valid measure of climate change.
I detail these errors in my paper “A Nobel Prize for Climate Modeling Errors”, Science of Climate Change 4(1) pp. 1-73 (2024). References to specific sections of this paper, identified as C24, are provided in the discussion.
The first three errors in simplifying energy transfer come from the 1967 paper by Manabe and Wetherald, titled “Thermal equilibrium of the atmosphere with a given distribution of relative humidity” [MW67].
In this paper, they used a one-dimensional radiative-convective (1-D RC) model with a steady-state air column and a fixed relative humidity distribution (see C24, Section 3.1).
1) The steady-state assumption artificially created warming in the calculation as the CO2 levels increased.
2) The fixed relative humidity assumption led to a ‘water vapor feedback’ that amplified this initial warming artifact.
3) Their time integration method allowed tiny temperature increases to add up over time. This could not happen in the real atmosphere, where such small changes are overshadowed by the normal daily and seasonal temperature variations.
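A toy calculation illustrates the third point. The sketch below is not the MW67 code; the per-step bias and the daily temperature swing are assumed values, chosen only to show how a steady-state integration turns an undetectable increment into degrees of ‘warming’.

```python
# Toy illustration only (NOT the MW67 model): a tiny per-step warming bias,
# far too small to detect against the daily temperature swing, accumulates
# linearly when a steady-state integration is run to 'equilibrium'.
dT_bias = 0.001      # assumed warming artifact per integration step, K
n_steps = 3000       # assumed number of steps to reach 'equilibrium'
daily_swing = 10.0   # typical day/night surface temperature range, K

accumulated = dT_bias * n_steps
print(f"Per-step artifact: {dT_bias} K (vs ~{daily_swing} K daily swing)")
print(f"Accumulated 'equilibrium' warming: {accumulated:.1f} K")
# In the real atmosphere the 0.001 K increment is swamped by the 10 K daily
# variation; in the model it simply sums to 3 K.
```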
When MW67 doubled the CO2 concentration from 300 to 600 ppm, the modeling errors led to an increase in surface temperature of 2.9°C under clear skies and 2.4°C with average cloud cover. This mistake laid the groundwork for the massive climate fraud we see today.
As global warming claims became a lucrative source of research funding, there was no turning back. From that point on, every climate model had to show similar warming for a ‘CO2 doubling.’ Today, this is known as Equilibrium Climate Sensitivity (ECS).
Manabe and Wetherald spent the next eight years integrating their 1967 model errors into a ‘highly simplified’ general circulation model (GCM) [MW75]. This new model had an ECS of 2.9°C, compared to the 2.4°C ECS with average clouds in MW67. MW75 set the warming benchmark for future climate GCMs.
Manabe and Stouffer introduced the fourth error in 1979 with their slab ocean algorithm. They neglected the real surface energy transfer processes, so their model ocean warmed as atmospheric CO2 increased. When they quadrupled the CO2 concentration from 300 to 1,200 ppm, this produced a surface warming artifact of 4.1°C.
In reality, the larger and more variable heat loss from wind-driven evaporation overwhelms the long-wave infrared (LWIR) radiation, which penetrates less than 100 microns into the ocean surface. The small increase in downward LWIR flux from CO2 is not enough to produce a measurable change in ocean surface temperature (see C24, Section 4.3).
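For a sense of scale, the standard bulk aerodynamic formula for evaporative heat loss can be used. The sketch below uses typical assumed values for wind speed and humidity, not numbers from C24.

```python
# Order-of-magnitude sketch (assumed typical values, not from C24): wind-driven
# evaporative heat loss from the ocean surface vs. the small CO2-related
# increase in downward LWIR flux discussed later in the article.
rho_air = 1.2      # air density, kg m^-3
L_v = 2.45e6       # latent heat of vaporization of water, J kg^-1
C_E = 1.2e-3       # bulk transfer coefficient for moisture (typical)
U = 7.0            # assumed wind speed, m s^-1
dq = 5.0e-3        # assumed sea-air specific humidity difference, kg kg^-1

latent_flux = rho_air * L_v * C_E * U * dq   # standard bulk formula, W m^-2
print(f"Evaporative heat flux: ~{latent_flux:.0f} W m^-2")   # ~123 W m^-2
# This flux also swings by tens of W m^-2 with wind speed alone, so a change
# of a few tens of mW m^-2 in LWIR flux is buried in the natural variability.
```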
Outside events also began to influence the development of the climate models. As the Apollo (moon landing) program wound down in 1972, there was ‘mission creep’. The planetary atmospheres group at NASA, including a young James Hansen, was told to switch to Earth applications (Hansen et al, 2000, Chapter 4).
In 1976 they copied the MW67 1-D RC model for ‘average cloudiness’ and created warming artifacts for ten minor species, including methane (CH4) and nitrous oxide (N2O) (Wang et al, 1976). Then in 1981, they added a two-layer slab ocean model to their version of MW67 (Hansen et al, 1981, hereafter H81).
Using available weather stations and related data, they created a global mean temperature record. However, they ignored the obvious peak near 1940 in their temperature record that was caused by the warming phase of the Atlantic Multidecadal Oscillation (AMO).
By adjusting the concentration of CO2, the solar intensity, and the effect of volcanic aerosols over time, they were able to ‘tune’ their modified 1-D RC model to give an output similar to the global mean temperature record.
H81 provided the foundation for the pseudoscience of radiative forcings, feedback, and climate sensitivity still found in the climate models today. This is illustrated in Figure 1 (See C24, Sections 3.3 and 4.5).
H81 assumed that the temperature changes created by the mathematical artifacts in a 1-D RC type of climate model were real and that a contrived fit to a global mean temperature record was somehow a measure of climate change.
Temperature is an intensive thermodynamic property, and temperature averages over independent thermodynamic systems are just numbers with no physical meaning (Essex et al, 2006). Climate change should be defined in terms of changes to climate zone boundaries, such as those used in the Köppen or related classifications (Kottek et al, 2006) (See C24, Section 5).
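The Essex et al point is easy to demonstrate. The sketch below uses two invented temperature pairs; depending on which equally defensible averaging rule is chosen, the same data show warming or cooling.

```python
# Minimal sketch of the Essex et al (2006) argument (numbers are invented):
# different, equally 'valid' averages of two independent temperatures can
# disagree on the sign of the trend.
t_before = (270.0, 310.0)   # K, two independent thermodynamic systems
t_after = (280.0, 299.0)    # K, the same systems after some change

def arithmetic(a, b): return (a + b) / 2
def harmonic(a, b): return 2 / (1 / a + 1 / b)
def radiative(a, b): return ((a**4 + b**4) / 2) ** 0.25  # T^4 (flux) weighting

for name, avg in [("arithmetic", arithmetic), ("harmonic", harmonic),
                  ("radiative", radiative)]:
    trend = avg(*t_after) - avg(*t_before)
    print(f"{name:>10} mean trend: {trend:+.2f} K")
# arithmetic: -0.50 K, harmonic: +0.57 K, radiative: -2.08 K. The 'trend'
# depends on the averaging rule, not on any physical property of the systems.
```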
As funding for nuclear programs dropped, mission creep spread to the programs of the former Atomic Energy Commission, which had been absorbed into the US Department of Energy (DOE) when it was formed in 1977. The DOE’s climate research program has focused on model intercomparison, where researchers compare results from flawed models without considering physical reality.
The Coupled Model Intercomparison Project (CMIP) began in 1996 (Meehl et al, 1997). Various phases of this project have provided much of the questionable climate model output used by the UN Intergovernmental Panel on Climate Change (IPCC) (Stouffer et al, 2017; Hausfather, 2019).
As computer technology has improved, climate models have become more complex. Researchers have added more forcing agents and adjusted changes over time to make the global average temperature record generated by the models match the one from weather station data.
Early 1-D models were replaced by atmospheric GCMs, which were then succeeded by coupled ocean-atmospheric GCMs. Today, researchers are developing Exascale models that combine emission scenarios with GCMs. However, all these large-scale models are built on the flawed foundation established by MW67 and H81.
The steady-state assumption in the 1-D models was replaced with a fictional global energy balance. The new assumption was that greenhouse gas radiative forcing or a decrease in outgoing long-wave IR radiation (OLR) at the top of the atmosphere (TOA) altered the Earth’s energy balance.
According to this assumption, the surface temperature would rise until the flux balance at TOA was restored (Knutti and Hegerl, 2008). Instead, the small amount of extra heat released in the troposphere gets radiated back to space as wideband LWIR emission, causing almost no change to the Earth’s energy balance (See C24, Section 4.1).
In addition, the GCMs require the solution of very large numbers of coupled non-linear equations. The errors associated with the solution of these equations grow with time, and the solutions can become unstable. Weather forecasting models are limited to predictions about 12 days ahead. The climate GCMs have no predictive capability over the much longer time scales required for climate studies [Lorenz, 1963; 1973].
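The Lorenz result is easy to reproduce. The sketch below integrates the Lorenz (1963) system with a simple Euler step (standard parameter values assumed) and shows two runs, initially differing by one part in a million, becoming completely decorrelated.

```python
# Minimal sketch of Lorenz (1963) sensitivity to initial conditions, using a
# simple forward-Euler step and the standard parameters (s=10, r=28, b=8/3).
def lorenz_step(x, y, z, dt=0.005, s=10.0, r=28.0, b=8.0 / 3.0):
    dx, dy, dz = s * (y - x), x * (r - z) - y, x * y - b * z
    return x + dx * dt, y + dy * dt, z + dz * dt

run_a = (1.0, 1.0, 1.0)
run_b = (1.0 + 1e-6, 1.0, 1.0)   # perturbed by one part per million
for i in range(1, 6001):
    run_a = lorenz_step(*run_a)
    run_b = lorenz_step(*run_b)
    if i % 2000 == 0:
        print(f"t = {i * 0.005:4.1f}: |x_a - x_b| = {abs(run_a[0] - run_b[0]):.4f}")
# The separation grows roughly exponentially until it saturates at the size of
# the attractor: beyond that horizon the solution carries no information about
# the initial state.
```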
The fifth error appeared in the Third IPCC Climate Assessment Report (TAR) in 2001. The report manipulated a time series of radiative forcings to create the illusion that it fit the global mean temperature record.
It divided these forcings into ‘natural’ and ‘anthropogenic’ contributions, as shown in Figure 2a. Researchers then reran the climate models to generate separate ‘natural baseline’ and ‘anthropogenic contribution’ results, illustrated in Figures 2b, 2c, and 2d.
A vague statistical argument, based on changes to the normal distribution of temperature, linked ‘anthropogenic’ forcings to increased frequency and intensity of ‘extreme weather events,’ as shown in Figure 2e. Figure 2a is from Tett et al, 2000. Figures 2b through 2e are from the TAR (the label AR3 is used instead of TAR in Figure 2). The report ignored the obvious 1940s peak related to the warming phase of the AMO.
This ‘extreme weather attribution’ provided pseudoscientific justification for the political control of fossil fuel combustion, leading to the problematic net zero policy we face today (See C24, Section 3.5).
Various political and environmental groups began to exploit the warming artifacts created by climate models to advance their agendas. A key event was the 1975 conference ‘The Atmosphere Endangered and Endangering,’ organized by anthropologist Margaret Mead (Hecht, 2007). She aimed to use atmospheric pollution—whether real or imagined—as a means of population control.
The IPCC was established in 1988, and the US Global Change Research Program (USGCRP) was created by presidential initiative in 1989 and mandated by Congress in 1990. The Hadley Climate Centre in the UK was also founded in 1990.
The IPCC’s mission is ‘to assess the scientific, technical and socioeconomic information relevant for the understanding of the risk of human-induced climate change.’ This mission rests on the assumption that human activities are driving CO2-induced global warming.
The USGCRP aims ‘to coordinate federal research and investments in understanding the forces shaping the global environment, both human and natural, and their impacts on society.’ However, it has failed to fulfill this mission by overlooking the natural causes of climate change.
Floods, droughts, wildfires, and other ‘extreme weather events’ result from natural weather conditions, such as downslope winds, high-pressure domes, and ocean oscillations, which are unaffected by increased greenhouse gas concentrations (See C24, Sections 4.6 and 5.1).
The USGCRP has uncritically accepted the pseudoscientific climate modeling methods used by the IPCC without making any effort at independent model validation.
The Fifth National Climate Assessment Report (NCA5), published by the USGCRP in 2023, showcases the pseudoscientific basis of its findings.
Figures 3.1, 3.2, and 3.3 in Chapter 3, Earth System Processes, illustrate the radiative forcings, feedbacks, and climate sensitivities used in the climate models that falsely justify the extreme weather claims presented in NCA5. These figures, adapted from the IPCC Sixth Climate Assessment Report (AR6, 2021), appear as Figures 3a, 3b, and 3c.
Figure 3a traces the ‘climate drivers’ back to the mathematical artifacts in early climate models. MW67 initiated the CO2 warming concept. Wang et al (1976) introduced warming effects from CH4, N2O, and halogenated gases. H81 used volcanic aerosols for ‘fine-tuning.’
The Third IPCC Climate Assessment Report (2001) emphasized human influence and its invalid link to extreme weather. MW67 also introduced the water vapor feedback shown in Figure 3b.
Figure 3c presents the climate sensitivity from the 1979 Charney report, which was based on early GCM work by Manabe’s group and unpublished results from Hansen’s group.
For 200 years, scientists have ignored Fourier’s work on the time delay between peak solar flux and the surface temperature response (Fourier, 1824). These daily and seasonal shifts clearly show a non-equilibrium thermal response to solar flux (Clark, 2023).
We need to view changes in flux as changes in the rate at which a thermal reservoir heats or cools (Clark and Rörsch, 2023). Doubling the CO2 concentration from 300 to 600 ppm causes a maximum increase in the LWIR tropospheric cooling rate of +0.08 K per day (Iacono et al, 2008).
For a lapse rate of -6.5 K km-1, a +0.08 K change is equivalent to an altitude decrease of about 12 meters, roughly the same as riding an elevator down four floors (See C24, Section 4.2).
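That conversion is simple arithmetic, reproduced below for checking (the values are those quoted above).

```python
# Reproducing the arithmetic quoted above: a +0.08 K change expressed as an
# equivalent altitude change on a -6.5 K km^-1 tropospheric lapse rate.
lapse_rate = 6.5   # K per km (magnitude)
delta_T = 0.08     # K per day, maximum change in LWIR cooling rate

delta_h = delta_T / lapse_rate * 1000.0   # convert km to m
print(f"Equivalent altitude change: {delta_h:.1f} m")   # ~12.3 m
```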
Currently, the average annual increase in atmospheric CO2 is about 2.4 ppm, which produces an increase of about 34 milliwatts per square meter in the downward LWIR flux from the lower troposphere to the surface.
This change is too small to measure against the normal daily variations in temperature and humidity. It cannot affect extreme weather events, meaning there is no detectable ‘CO2 signal’ in the climate record (See C24, Section 4.5).
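As a rough cross-check, the widely used logarithmic approximation for CO2 forcing, dF = 5.35 ln(C/C0) W m-2 (Myhre et al, 1998), gives a similar order of magnitude. Note that this is a top-of-atmosphere forcing formula, not the surface flux quoted above, and the 420 ppm baseline is an assumption.

```python
# Hedged cross-check (assumed 420 ppm baseline): annual forcing increment from
# a 2.4 ppm CO2 rise using the logarithmic approximation of Myhre et al (1998).
import math

C0 = 420.0   # assumed current CO2 concentration, ppm
dC = 2.4     # average annual increase, ppm (from the article)

dF = 5.35 * math.log((C0 + dC) / C0)   # W m^-2, TOA forcing approximation
print(f"Annual increment: ~{dF * 1000:.0f} mW m^-2")   # ~30 mW m^-2
# Similar in magnitude to the ~34 mW m^-2 surface value quoted above, and far
# below the tens of W m^-2 of natural day-to-day flux variability.
```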
The climate modeling community has failed to address and correct errors in their models. Climate modelers, who are more applied mathematicians and computer programmers than scientists, focus on solving complex equations and securing their paychecks.
They have ignored the fundamental errors in their models. The tiny changes in temperature and humidity predicted by these models cannot accumulate over time in the real world.
The supposed greenhouse gas-induced warming and water vapor feedback are mere mathematical artifacts from the models’ simplifying assumptions, established before any computer code was written.
Any climate model with an equilibrium climate sensitivity greater than ‘too small to measure’ is inherently fraudulent. No need for further investigation.
Climate modeling has shifted from scientific inquiry to what resembles an Imperial Cult of the Global Warming Apocalypse. Belief in the models’ results has replaced evidence-based reasoning. Climate modelers have become the prophets of this Cult.
The climate modeling fraud is clear in the literature. The issues are no longer scientific; political and legal action is needed to address the many problems involved.
Roy Clark is a retired engineer with over 40 years of experience in optics, spectroscopy, and new product development.
You left out the largest error, the one that underpins the entirety of the CAGW ‘industry’.
https://www.patriotaction.us/showthread.php?tid=2711
That error is the misuse of the Stefan-Boltzmann (S-B) equation: using the idealized blackbody form of the equation in their Energy Balance Climate Models (EBCMs) on real-world graybody objects.
The idealized blackbody form of the S-B equation assumes emission to 0 K, so it artificially inflates the radiant exitance of every object it is applied to. The climatologists are then forced to carry these inflated values through and cancel them on the back end… subtracting a wholly-fictive ‘cooler to warmer’ energy flow from the real (but too high, because it was calculated for emission to 0 K) ‘warmer to cooler’ energy flow.
It should be obvious that this conjures “backradiation” out of thin air. This “backradiation” is then claimed to cause their purported “greenhouse effect”.
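A minimal numerical sketch of this argument follows. The temperatures and the emissivity are assumed values; the point is only the structure of the calculation, comparing the idealized form with the net-exchange form of the S-B equation.

```python
# Sketch of the commenter's argument (temperatures and emissivity assumed):
# idealized blackbody form (emission to 0 K) vs. graybody net-exchange form.
SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W m^-2 K^-4

T_h, T_c = 288.0, 255.0   # K: assumed surface and cooler-surroundings temperatures
eps = 0.93                # assumed graybody emissivity

idealized = eps * SIGMA * T_h**4        # calculated as if emitting to 0 K
net = eps * SIGMA * (T_h**4 - T_c**4)   # net warmer-to-cooler exchange
cancelled = idealized - net             # the 'cooler to warmer' term

print(f"Idealized (to 0 K) exitance: {idealized:6.1f} W m^-2")   # ~363
print(f"Net exchange:                {net:6.1f} W m^-2")         # ~140
print(f"Back-end cancellation term:  {cancelled:6.1f} W m^-2")   # ~223
# On the commenter's account, this difference term is what energy-balance
# diagrams label 'backradiation'.
```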
The climatologists knew that “backradiation” violated 2LoT in the Clausius Statement sense, was thus physically impossible, and thus couldn’t show any effect… so they conflated their “greenhouse effect” with the Kelvin-Helmholtz Gravitational Auto-Compression Effect (aka the Adiabatic Lapse Rate).
You’ll note the lapse rate has nothing to do with any “greenhouse effect”, nor any “greenhouse gases”, and is instead the result of the conversion of translational mode (kinetic) energy in the z-axis DOF (Degree of Freedom) to gravitational potential energy with altitude (and vice versa). That change in z-axis DOF kinetic energy then equipartitions to the other 2 linearly-independent DOF upon subsequent collisions, per the Equipartition Theorem.
So “backradiation” is nothing more than a mathematical artifact due to the misuse of the S-B equation, the “greenhouse effect” doesn’t actually exist… what about “greenhouse gases”?
Well, given that the “greenhouse effect” was used to designate polyatomics as “greenhouse gases”, it stands to reason that “greenhouse gases” (in the climatologists’ “greenhouse effect due to backradiation” sense) don’t exist, either.
“Greenhouse gases” (in the strict ‘actual greenhouse’ sense) do exist, though… they are the monoatomics and, to a lesser extent, the homonuclear diatomics. Neither is an effective radiative emitter, and radiative emission to space is the only way our planet can shed energy to cool. I explain this more fully at the link above.
In fact, while the climatologists claim water vapor is the most potent “greenhouse gas”, in reality, water is such a potent atmospheric radiative coolant that it drastically reduces the Dry Adiabatic Lapse Rate (which is ~9.81 K km-1)… the Humid Adiabatic Lapse Rate ranges from ~3.5 K km-1 (high humidity) to ~6.5 K km-1 (average humidity).
We know that the ‘effective emission height’ temperature is ~255 K and it is at ~5.105 km altitude.
3.5 K km-1 * 5.105 km = 17.8675 K; 255 K + 17.8675 K = 272.8675 K surface temperature (high humidity Humid Adiabatic Lapse Rate)
6.5 K km-1 * 5.105 km = 33.1825 K; 255 K + 33.1825 K = 288.1825 K surface temperature (average humidity Humid Adiabatic Lapse Rate)
9.81 K km-1 * 5.105 km = 50.08005 K; 255 K + 50.08005 K = 305.08005 K surface temperature (Dry Adiabatic Lapse Rate)
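These three results can be checked with a few lines of code (temperatures and heights as quoted above):

```python
# Reproducing the commenter's arithmetic: surface temperature from the
# effective emission temperature, emission height, and the quoted lapse rates.
T_eff = 255.0    # K, effective emission temperature
h_emit = 5.105   # km, effective emission height

for label, gamma in [("high humidity", 3.5),
                     ("average humidity", 6.5),
                     ("dry", 9.81)]:
    T_surf = T_eff + gamma * h_emit   # K
    print(f"{label:>16} ({gamma:5.2f} K km^-1): T_surface = {T_surf:.4f} K")
# average humidity: 255 + 6.5 * 5.105 = 288.1825 K, the familiar ~288 K mean
# surface temperature.
```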
Notice that “33.1825 K” above for the “average humidity Humid Adiabatic Lapse Rate”? Yeah, that’s what the climatologists claim their “greenhouse effect” causes. Their “greenhouse effect” doesn’t cause it, the lapse rate does.
Note that water drastically cools the atmosphere… the ‘average humidity Humid Adiabatic Lapse Rate’ is the ‘Dry Adiabatic Lapse Rate’ minus the radiative cooling by water vapor… but remember that the climatologists claim that water vapor is the most potent “greenhouse gas”.
Water is not a “greenhouse gas”, it is such a potent radiative atmospheric coolant that not only does it drastically reduce the lapse rate, but it acts as a literal refrigerant (in the strict ‘refrigeration cycle’ sense) below the tropopause.
The refrigeration cycle (Earth) [AC system]:
A liquid evaporates at the heat source (the surface) [in the evaporator], it is transported (convected) [via an AC compressor], it gives up its energy to the heat sink and undergoes phase change (emits radiation in the upper atmosphere, the majority of which is upwelling owing to the energy density gradient and the mean free path length / altitude / air density relation) [in the condenser], it is transported (falls as rain or snow) [via that AC compressor], and the cycle repeats.
The climatologists have flipped thermodynamics on its head with their misuse of the S-B equation… they are as near to diametrically opposite to reality as they could possibly be. IOW, they are as wrong as they could possibly be.
Author Roy Clark is a fool who does not realize there are no real climate models. What are called models are just computer games used as propaganda, programmed to scare people with predictions of potentially (in hundreds of years) dangerous global warming, and blaming that warming almost entirely on humans.
The “confuser games” pretend to know the long-term effects of CO2 emissions, including feedbacks… when scientists only agree on a mild warming effect (about +0.75 degrees C per CO2 doubling from CO2 alone, as measured in labs using infrared gas spectroscopy). Harmless.
The author goes into la la land near the end with a series of false conclusions NOT backed by data:
(1) “The supposed greenhouse gas-induced warming and water vapor feedback are mere mathematical artifacts…” CLARK
Nearly 100% of scientists since 1896 believe the greenhouse effect exists, based on evidence.
The author is ignorant
The water vapor positive feedback theory is supported by absolute humidity trends from 1980 to 2000, but not from 2000 to 2020. AH data are not very accurate, so that feedback cannot be measured. It is also offset, partially or fully, by at least two negative feedbacks resulting from a warmer surface:
(1) More evaporation
(2) More upwelling long wave radiation
Some scientists think (1) and (2) completely offset the water vapor positive feedback (WVPF). Others think the offset has a time lag. The alarmists believe the WVPF is very strong. Obviously, no one knows.
(2) “Any climate model with an equilibrium climate sensitivity greater than ‘too small to measure’ is inherently fraudulent.” CLARK
Nearly 100% of scientists since 1896 disagree
The author is ignorant.
Here we have a non-scientist author claiming he knows more than almost 100% of scientists since 1896. That is silly science.
The climate models predict an average of about 3 degrees C per CO2 doubling, while actual warming since 1975 was about +2.4 degrees C per CO2 doubling. That was just a lucky guess, but a lucky guess makes refuting the models a difficult task.
The models / confuser games make long-term guesses about CO2 effects that are not supported by data.
Science requires data.
The only science we have, which has withstood a 127 year test of time since 1896, is the existence of a greenhouse effect and the fact that CO2 at the current 425 ppm is a weak greenhouse gas. I wonder if the author would even agree to that?
A judge in the UK forced the makers of Gore the Bore’s “An Inconvenient Truth” to list its nine errors.