When the media claims we’re experiencing a “once-in-500-year” or “once-in-1,000-year” weather event, they’re missing a fundamental point about how data works, something I’m acutely aware of as an Earth Scientist.
In geology, we study the Earth’s long history through rock formations, sediment layers, and fossil records, which help us track major climatic trends and shifts over Earth’s history.
We can read signs of past floods, droughts, and shifts in temperature, but there’s a crucial limitation here: geological proxies don’t capture daily weather extremes.
You might find evidence of sustained climatic conditions that result in long-term sediment buildup or erosion.
For instance, we can see signs of long-term droughts or periods of significant rainfall across millennia, but you’re not going to find a fossil or rock layer that tells you, “Oh, it rained 10 inches in 24 hours on this particular day 2,000 years ago.” That’s just not how proxies work.
Proxies give us broad trends over long periods of time, not the kind of hyper-detailed weather data required to make statements about rare events like a “1,000-year storm.”
The truth is, these assertions are built on shaky statistical ground, and there’s no real way we can be certain about the frequency of these events given how little data we actually have.
The Statistical Absurdity of Rare Event Claims
When we talk about events of such extreme rarity, such as a “once-in-1,000-year” flood, we’re referring to a statistical probability based on a distribution of observed events over time.
The tail end of any statistical distribution, especially one that measures the frequency of rare, extreme events, is always the hardest to fill in. Simply put, the more extreme the event, the less data we have to make reliable predictions about how often it occurs.
So when the MSM declares that a particular weather event falls into the “500-year” or “1,000-year” category, it’s often based on incomplete data, assumptions, and models that are far from definitive.
In my previous article, I highlight the problem of relying on limited datasets when making sweeping claims about extreme weather events.
I point out that the high-side tail takes the longest to fill in, because the extremes are, by definition, rare. This means that the further out we go on the distribution, the more speculative claims about the frequency of such events become.
The High-Side Tail Takes the Longest to Fill In
Here’s an analogy that might help: imagine you’re filling a jar with marbles, but some marbles are much rarer than others. Let’s say most of the marbles are white, but there are a few rare blue ones mixed in.
You’ve been scooping marbles into the jar for years, and so far, you’ve only found a few blue marbles. Someone might be tempted to declare that finding a blue marble is incredibly rare, maybe a “once in 1,000 scoops” event. But if you’ve only scooped 50 times, that conclusion is, at best, premature.
The same principle applies to weather extremes. We’re dealing with relatively short periods of data collection, and because of this, we’ve barely begun to fill in the rare “blue marbles” of extreme weather events.
Yet, the media and even some scientists act as if we’ve already mapped out the entire distribution.
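The marble analogy is easy to check with a short simulation. This is a sketch, not real data: I assume, purely for illustration, that the true chance of scooping a blue marble is 1 in 1,000, and then ask how often a 50-scoop record turns up even a single one.

```python
import random

random.seed(42)

TRUE_P = 1 / 1000  # assumed true rarity of a blue marble, per scoop (illustrative)

def blue_count(n_scoops: int) -> int:
    """Count how many blue marbles turn up in n_scoops independent scoops."""
    return sum(random.random() < TRUE_P for _ in range(n_scoops))

# Repeat the 50-scoop experiment many times and see how often
# the record contains no blue marbles at all.
trials = [blue_count(50) for _ in range(10_000)]
frac_empty = sum(c == 0 for c in trials) / len(trials)
print(frac_empty)
```

Roughly 95% of the simulated 50-scoop records contain no blue marble at all (0.999 to the 50th power is about 0.95), so a 50-scoop record cannot distinguish a 1-in-100 rarity from a 1-in-10,000 one. That is the situation we are in with extreme weather: the record is far too short to pin down the tail.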
Insufficient Historical Data
The fundamental problem is that we simply don’t have enough real-time data over a long enough period to make robust claims about the frequency of such rare events.
Meteorological records only go back about 150 years in most regions, and high-quality, granular data is even more recent. This is a far cry from the 500 or 1,000 years needed to reliably estimate the occurrence of these so-called extreme events.
Imagine trying to estimate the frequency of a rare weather event from a dataset that covers less than one-third of the time required to make a “500-year” claim.
The statistical uncertainty becomes massive, and any declaration about a “1,000-year event” becomes almost meaningless in this context. You’d need far more observations of such rare events to make even a modest claim with confidence.
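To put a number on that uncertainty, here is a minimal sketch. It assumes, purely for illustration, that annual maxima follow a standard Gumbel distribution (a common textbook model for block maxima; the parameters are not fitted to any real record), and compares the true 500-year return level with naive estimates taken from repeated simulated 150-year records.

```python
import math
import random
import statistics

random.seed(0)

def gumbel() -> float:
    """Draw one annual-maximum value from a standard Gumbel distribution
    (location 0, scale 1; illustrative parameters, not a fitted model)."""
    u = random.random()
    return -math.log(-math.log(u))

# True 500-year return level: the value exceeded with probability 1/500 in any year.
true_level = -math.log(-math.log(1 - 1 / 500))

# Naive estimate from a 150-year record: take the largest value observed,
# as if the record already contained the rarest event of interest.
estimates = [max(gumbel() for _ in range(150)) for _ in range(2000)]

print(round(true_level, 2))
print(round(min(estimates), 2), round(statistics.median(estimates), 2),
      round(max(estimates), 2))
```

Two things show up in this toy setup: the estimates scatter across a range several times wider than the gap between a typical year and the true level, and in most simulated 150-year records the largest observed value falls short of the true 500-year level, because most 150-year windows simply never contain a 500-year event.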
As I detailed in a piece titled ‘Smoothing the past…’, historical weather and climate data are often manipulated to fit a specific narrative. We see the same tendency in the reporting of extreme weather events.
The narrative tends to oversimplify, smoothing over uncertainty to deliver a dramatic headline without fully understanding the statistical limitations.
Irrational Fear is written by climatologist Dr. Matthew Wielicki and is reader-supported. If you value what you have read here, please consider subscribing and supporting the work that goes into it.