Just about every projected environmental catastrophe, from the population bomb of the late 1960s through the “Club of Rome” and “Global 2000” resource-exhaustion panics of the 1970s to the ozone depletion crisis of the 1980s and beyond, has depended on computer models, and all of them turned out to be wrong, sometimes by an order of magnitude.
No putative environmental crisis has depended more on computer models than “climate change.”
But in an age of high confidence in supercomputing and rapidly advancing “big data” analytics, computer climate models have arguably gone in reverse, generating a crisis within the climate-change community.
The defects of the computer climate models on which the whole climate crusade depends (more than 60 are in use at present) have been openly acknowledged over the past few years, and a fresh study in the mainstream scientific literature recently highlighted the problem anew: too many of the climate models are “running hot,” which calls into question the accuracy of future temperature projections.
Nature, one of the premier “mainstream” science journals, last week published “Climate simulations: recognize the ‘hot model’ problem,” by four scientists, all firmly established within the “consensus” climate-science community.
It is a carefully worded article, aiming to avoid giving ammunition to climate-change skeptics, while honestly acknowledging that the computer models have major problems that can lead to predictions of doom that lack sufficient evidence.
“Users beware: a subset of the newest generation of models are ‘too hot’ and project climate warming in response to carbon dioxide emissions that might be larger than that supported by other evidence,” the authors write.
While the authors affirm the general message that human-caused climate change is a serious problem, the clear subtext is that climate scientists need to do better lest the climate-science community surrender its credibility.
One major anomaly of the climate modeling scene is that, as the authors write, “As models become more realistic, they are expected to converge.” But the opposite has happened—there is more divergence among the models.
Almost a quarter of recent computer climate models show much higher potential future temperatures than past model suites, and don’t match up with known climate history:
“Numerous studies have found that these high-sensitivity models do a poor job of reproducing historical temperatures over time and in simulating the climates of the distant past.”
What this means is that our uncertainty about the future climate is increasing. To paraphrase James Q. Wilson’s famous admonition to social scientists, never mind predicting the future; many climate models can’t even predict the past.
A quick primer: computer climate models generally predict that a doubling of the level of greenhouse gases (GHGs), principally carbon dioxide (CO2), by the end of this century would increase the global average temperature by somewhere between 1.5 and 4.5 degrees Celsius.
At present rates of GHG emissions, we’re on course to double the GHG level in the atmosphere about 80-100 years from now.
Why is the range so wide, and why does it matter? First, the direct thermal effect of doubling GHGs is only about 1.1 degrees Celsius.
So how do so many models predict 4.5 degrees or more? Two words: feedback effects.
That is, changes in atmospheric water vapor (clouds, which both trap and reflect heat), wind patterns, ocean temperatures, shrinkage of ice caps at the poles, and other dynamic changes in ecosystems on a large scale.
Yet it is precisely in these feedback effects that the computer models are weakest and perform most poorly.
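A back-of-the-envelope way to see why the feedbacks dominate the spread: in the standard textbook treatment, the no-feedback warming of about 1.1 degrees is amplified to roughly 1.1 / (1 - f), where f is the net feedback factor. The short Python sketch below uses purely illustrative values of f (our assumption for illustration, not output from any climate model) to show how a modest spread in f reproduces the whole 1.5 to 4.5 degree range:

```python
# Back-of-the-envelope feedback amplification: the standard textbook
# relation dT = dT0 / (1 - f), where dT0 is the no-feedback warming
# from doubled CO2 and f is the net feedback factor (f < 1).
# All values here are illustrative assumptions, not model output.

DT0 = 1.1  # no-feedback ("direct thermal") warming for doubled CO2, deg C

def equilibrium_warming(f: float) -> float:
    """Warming after feedbacks, for a net feedback factor f (must be < 1)."""
    return DT0 / (1.0 - f)

# A modest spread in the assumed feedback factor spans the whole
# 1.5-4.5 deg C range quoted above:
for f in (0.25, 0.50, 0.65, 0.75):
    print(f"f = {f:.2f} -> warming ~ {equilibrium_warming(f):.1f} deg C")
```

Small changes in the assumed feedback factor produce large changes in the projected warming, which is why getting clouds and other feedbacks wrong matters so much.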
The huge uncertainties in the models (especially for the most important factor, clouds) are always candidly acknowledged in the voluminous technical reports the U.N.’s Intergovernmental Panel on Climate Change (IPCC) issues every few years. But few people, and no one in the media, bother to read the technical sections carefully.
Why are climate models so bad? And can we expect them to improve any time soon?
Steven Koonin, a former senior appointee in the Department of Energy in the Obama administration, explains the problem concisely in his recent book Unsettled: What Climate Science Tells Us, What It Doesn’t, and Why It Matters.
The most fundamental problem with all climate models is their limited “resolution.” Climate models are surprisingly crude, as they divide up the atmosphere into 100 km x 100 km grids, which are then stacked like pancakes from the ground to the upper atmosphere.
Most climate models have about one million atmospheric grid cells and as many as 100 million smaller (10 km x 10 km) cells for the ocean.
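Those counts are easy to sanity-check with rough arithmetic: divide the surface area by the area of one grid square, then multiply by the number of vertical layers. A quick Python sketch (the layer counts are illustrative assumptions on our part; they vary from model to model):

```python
# Rough sanity check of the grid-cell counts quoted above. The surface
# areas are standard figures; the vertical layer counts are illustrative
# assumptions, since layer counts vary from model to model.

EARTH_SURFACE_KM2 = 510e6  # total surface area, ~510 million sq. km
OCEAN_SURFACE_KM2 = 361e6  # ocean surface area, ~361 million sq. km

def cell_count(surface_km2: float, cell_edge_km: float, layers: int) -> float:
    """Approximate number of 3-D grid cells covering the given surface."""
    columns = surface_km2 / (cell_edge_km ** 2)
    return columns * layers

atmos = cell_count(EARTH_SURFACE_KM2, cell_edge_km=100, layers=20)
ocean = cell_count(OCEAN_SURFACE_KM2, cell_edge_km=10, layers=30)

print(f"atmosphere: ~{atmos:,.0f} cells")  # ~1 million
print(f"ocean:      ~{ocean:,.0f} cells")  # ~100 million
```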
The models then attempt to simulate what happens within each grid square and sum the results. It can take up to two months for the fastest supercomputers to complete a model “run” based on the data assumptions input into the model.
The problem is that “many important [climate] phenomena occur on scales smaller than the 100 km (60-mile) grid size (such as mountains, clouds, and thunderstorms).”
In other words, the accuracy of the models is highly limited. Why can’t we simply run the models on a finer grid? Koonin, who taught computational physics at Caltech, explains:
“A simulation that takes two months to run with 100 km grid squares would take more than a century if it instead used 10 km grid squares. The run time would remain at two months if we had a supercomputer one thousand times faster than today’s—a capability probably two or three decades in the future.”
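Koonin’s thousand-fold figure follows from a standard cost-scaling argument for grid-based simulations, sketched below in Python (the time-step reasoning via the CFL stability condition is our gloss on the quote, not Koonin’s own wording):

```python
# Illustrative check of the run-time scaling in Koonin's quote. Assumes
# total cost ~ (number of cells) x (number of time steps): refining the
# horizontal grid 10x in each direction means 100x more cells, and
# numerical stability (the CFL condition) forces the time step to shrink
# ~10x as well, for ~1,000x the cost. A rough, standard scaling argument.

def cost_multiplier(refinement: float) -> float:
    """Cost growth when the horizontal grid is refined `refinement`-fold."""
    more_cells = refinement ** 2  # finer in both horizontal directions
    more_steps = refinement       # proportionally smaller time step (CFL)
    return more_cells * more_steps

baseline_months = 2.0             # quoted run time on a 100 km grid
mult = cost_multiplier(100 / 10)  # going from 100 km to 10 km squares

print(f"cost multiplier: {mult:,.0f}x")                       # 1,000x
print(f"run time: ~{baseline_months * mult / 12:.0f} years")  # ~167 years
```

Two months multiplied by a thousand is roughly 167 years, hence Koonin’s “more than a century.”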
But even if the models get better at the dynamics of what happens in the atmosphere on a more granular scale, the models still depend on future GHG emissions forecasts, and there is a wide range of emissions scenarios the modelers use.
The high-end temperature forecasts depend on extreme projections of future emissions that are no longer credible, such as one scenario included in previous U.N. reports that assumed a six-fold increase in the use of coal over the next 80 years, an outcome no one thinks is going to happen (or, if it does, only with massive carbon-capture technology).
Emissions forecasts made just 20 years ago turned out to be much too high for today. Nearly all of the most alarming claims of the effects of future warming depend on these discredited forecasts, but the media has failed to keep up with the changing estimates. It’s a classic garbage-in, garbage-out problem.
The Nature article is candid about this problem:
“The largest source of uncertainty in global temperatures 50 or 100 years from now is the volume of future greenhouse-gas emissions, which are largely under human control. However, even if we knew precisely what that volume would be, we would still not know exactly how warm the planet would get.”
The authors of the Nature article are taking a risk in dissenting, however cautiously, from the politicized party line on climate science, and they deserve credit for their candor and self-criticism of climate modeling.
Read more at The Pipeline
Michael Mann in the Penalty Box for Political Use of a Hockey Stick
Since the U.N. IPCC itself has stated that climate is a coupled non-linear chaotic system, making long-term forecasts impossible, why does it continue to claim it can forecast, relying on the very system it says makes such forecasts impossible?
Roger, the answer to your question is easy: they have to keep beating the drums of climate catastrophe to keep the money flowing. This isn’t about “climate change” caused by humans but about global governance and total control over us peons.
Time to dump Mann’s Hockey Stick and fine him for faking the data to get what he wants, and that’s more government (taxpayer) money.
One of the basic principles of true science is that if a theory doesn’t match the data, you scrap it or modify it until it does. On that basis, the climate models should have been scrapped or brought into line with reality over 15 years ago.
From the article, “The most fundamental problem with all climate models is their limited resolution.” In reality, the fundamental problem is that the climate models are specifically designed to support the climate change political agendas. As such, they have to run hot. From the article, “And can we expect them to improve any time soon?” The answer is no, because improvement would weaken the support for action on climate change.
From the article, “but the media has failed to keep up with the changing estimates.” The media is on a propaganda campaign for increased action on climate change. They certainly are not going to keep up with changing estimates that are less alarming.