In 2015, a member of Congress asked my university to investigate me based on testimony I had recently given before House and Senate committees.1
In that testimony, I summarized the conclusions of recent reports of the Intergovernmental Panel on Climate Change (IPCC) related to the detection and attribution of trends in extreme weather and disasters. THB readers know the data and literature well, as it is a topic I discuss often. [emphasis, links added]
The congressman apparently did not like my testimony, so in a letter to my university demanding that I be investigated, he claimed that I was perhaps taking undisclosed money from Exxon in exchange for that testimony, in which I had expressed views aligned with the IPCC and based on my own work in the peer-reviewed literature, work the IPCC itself cited.
If this sounds ridiculous, well, it was.
My university duly investigated me and, of course, found nothing; the congressman quietly abandoned the matter; and a year later an organized campaign against me and my work was revealed in the WikiLeaks emails released prior to the 2016 U.S. election (I’ll leave that story for another time).
I was reminded of this experience while reading an excellent new essay by Ted Nordhaus of The Breakthrough Institute2, published today by The New Atlantis and titled “Did Exxon Make it Rain Today?”
Nordhaus does a nice job explaining that disasters occur at the confluence of an extreme event and an exposed and vulnerable society, but most attention these days is paid to extreme events, and climate change in particular:
What determines whether hurricanes, floods, heat waves, and wildfires amount to natural disasters or minor nuisances, though, is mostly not the relative intensity or frequency of the natural hazard but rather how many people are in harm’s way and how well protected they are against the climate’s extremes.
Infrastructure, institutions, and technology mediate the relationship between extreme climate and weather phenomena, and the costs that human societies bear as a result of them. . .
The implications of this point will be counterintuitive for many. Yes, there are many types of disasters, like hurricanes and floods, that are causing greater economic costs in many places than they used to. But this is almost entirely because the places that are most exposed to weather disasters have far more people and far more wealth in harm’s way than they used to. Even if there were no global warming, in other words, these areas would be much more at risk simply because they have much more to lose.
However, what I find really interesting about Nordhaus’ essay is his discussion of how we got to a point where leading journalists and scientists are seeking to deny these rather obvious conditions and instead focus obsessively on human-caused climate change, and specifically on the fossil fuel industry as bearing responsibility for increasing disaster costs, contrary to an overwhelming scientific consensus.
Nordhaus explains that climate advocates have a long history of trying to tie disasters to climate change, dating back decades:
Those efforts intensified after Hurricane Katrina struck New Orleans in 2005, with Al Gore using it as a centerpiece in An Inconvenient Truth.3 A few years later, in 2012, the Union of Concerned Scientists convened a gathering of environmental advocates, litigators, climate scientists, and opinion researchers in La Jolla, California. Their explicit purpose was to develop a public narrative connecting extreme weather events that were already happening, and the damages they were causing, with climate change and the fossil fuel industry.
The proceedings from that gathering, which were subsequently published in a report titled “Establishing Accountability for Climate Change Damages: Lessons from Tobacco Control,” are revealing.
The IPCC, over decades of reports, has not concluded with high confidence that a signal of human-caused climate change can be detected for most types of extreme weather, especially those that result in the greatest impacts. That remains the case today.
For those wanting to promote climate action using contemporary disasters as a reason to act, the IPCC’s consistent conclusions — no matter how deeply buried in its reports — present a problem.
So alternative facts needed to be created. Nordhaus explains:
Myles Allen, the climate scientist who is credited with creating the field of “extreme event attribution,” is described in the report as lamenting that “the scientific community has frequently been guilty of talking about the climate of the twenty-second century rather than what’s happening now.” Yet, he and other scientists at the gathering also acknowledged how difficult it is to identify the contributions of climate change to current extreme weather events. “If you want to have statistically significant results about what has already happened,” another scientist, Claudia Tebaldi, noted, “we are far from being able to say anything definitive because the signal is so often overwhelmed by noise.”
While much of the convening was ostensibly focused on litigation strategies, modeled on campaigns against the tobacco industry, the subtext of the entire conversation was how to raise the public salience of a risk that is diffuse, perceived to be far off in time and space, and associated with activities — the combustion of fossil fuels — that bring significant social benefits.
Nordhaus explains that a three-pronged strategy emerged from the 2012 meeting — lowering scientific standards (from those of the IPCC) to enable stronger claims, redefining the attribution of causality differently than the IPCC, and emphasizing the villainous nature of fossil fuel companies to give people an enemy:
During the meeting, Naomi Oreskes, the Harvard historian of science who popularized the connection between climate and tobacco, argued that scientists should use a different standard of proof for the relationship between climate change and extreme weather events.
“When we take these things to the public,” she argued, “we take a standard of evidence applied internally to science and use it externally.” But, she continued, the 95-percent confidence standard that scientists use “is not the Eleventh Commandment. There is nothing in nature that taught us that 95 percent is needed. That is a social convention.”
Others suggested that reframing the attribution of extreme weather to climate change could allow for stronger claims: rather than looking at whether there was any long-term detectable trend in extreme weather, scientists might instead focus on the degree to which climate change increased the likelihood of a given extreme event. And others believed that focusing legal strategies on a villain — fossil fuel companies conspiring to mislead the public about the danger of their product — would result in greater public acceptance of the claims that climate change was the cause of extreme weather.
As it happened, environmental advocates would pursue all of these strategies.
Nordhaus further explains that broader changes in the media occurred at a perfect time to boost these strategies aimed at creating a new narrative:
Not so long ago, news coverage needed to be credible to multiple audiences whose politics and values spanned a relatively broad spectrum of worldviews and values. But the proliferation of media outlets and platforms in recent decades, first with the rise of cable news and then the Internet, has increasingly fragmented media audiences.
Today, media outlets large and small compete in a far more crowded marketplace to reach much narrower segments of the population. This incentivizes them to tailor their content to the social and political values of their audiences and serve up spectacles that comport with the ideological preferences of the audiences they are trying to reach. For the audiences that elite legacy outlets such as the New York Times now almost exclusively cater to, that means producing a continual stream of catastrophic climate news.
I suspect that the only place that most of you reading this will encounter Nordhaus’ essay is right here at THB.4
Reporters on the “climate beat” know very well that acknowledging the existence of Nordhaus’ essay or the arguments he makes might offend the politics of their employers, readers, and colleagues.
Nordhaus explains that a large majority of environmental journalists refuse to engage with narrative-challenging viewpoints (emphasis added):
Reporters and editors at these outlets are also well-aligned ideologically with their audiences. A national survey of political journalists and editors working for newspapers at the state and national level conducted in 2022 found that those identifying as Democrats outnumbered those identifying as Republicans by 10 to 1. A 2018 survey of environmental journalists by George Mason University’s Center for Climate Change Communication, meanwhile, found that 70 percent reported trusting information from environmental advocacy organizations versus fewer than 10 percent from business groups.
Seventy-one percent reported that they never or rarely included opposing viewpoints in their coverage of climate change.
If I get into my mental time machine and go back to 2013-2015, when I presented the Congressional testimony that led to my being attacked by the White House (another story for another time) and then investigated by a member of Congress, it is clear that I was simply an inconvenient scholar presenting uncomfortable knowledge from a prominent platform.
The research that I and colleagues had been doing for the previous few decades — no matter how accurate or well-cited — needed to be removed from the playing field in favor of alternative facts. That strategy had the further advantage of not having to take on the IPCC directly.
As for anyone who takes on the alternative facts? Well, they are obviously a climate denier and probably also taking money from Exxon.5 Honestly, it has been a hugely successful campaign.
Even so, I remain optimistic that good science wins out over alternative facts, even if that process takes a frustratingly long while.
I applaud the IPCC’s Working Group 1 for steadfastly playing things straight on extreme event detection and attribution,6 and also The Breakthrough Institute’s Ted Nordhaus (and Patrick Brown) for promoting good science in the face of what surely are significant social and professional pressures.
Head over to The New Atlantis and read Nordhaus’ entire piece — it is well worth your time. I invite you back to comment on it and to discuss.
I had an excellent physics teacher who told us that all systems tend towards zero potential energy. I’ve never seen him proven wrong.
I think you could call it “entropy”. Without an input of energy from outside, a system will tend to disorder. Luckily, planet Earth gets energy pumped into it from the sun. We can also tap into energy from the earth (oil, gas, coal, uranium, …) to hold off the entropy on the surface.
I emailed the 5 people who made the 3-2 SEC ruling dictating that businesses must now report “greenhouse gas” emissions.
Chair@sec.gov
CommissionerPeirce@sec.gov
CommissionerCrenshaw@sec.gov
CommissionerUyeda@sec.gov
CommissionerLizarraga@sec.gov
Those are on the SEC website, so no problem publicizing them.
––––––––––––––-*-
Title: Your decisions should at least hew to the fundamental physical laws.
Greetings.
{ First, a note: Read this message (and the attached PDF file) in full… you’ll find all of the mathematics work out, all of the conclusions hew to the fundamental physical laws, and thus the entire premise of your recent ruling is therefore incorrect. }
Your 3-2 decision to require businesses to report ‘greenhouse gas’ emissions is based upon CAGW deeming CO2 to be a deleterious molecule to the climate (upon which the EPA based their CO2 Endangerment Finding, and upon which you based your ruling)… except the CAGW hypothesis has been debunked. Hence all the downstream findings and decisions based upon that premise of CAGW deeming CO2 to be a deleterious molecule to the climate are fallacious.
In the attached paper, I definitively, mathematically disprove the CAGW (Catastrophic Anthropogenic Global Warming, due to CO2) hypothesis; I prove it is brought about via a misuse of the Stefan-Boltzmann (S-B) equation; I use the “Earth Energy Balance” graphic from Kiehl-Trenberth (it and all subsequent similar graphics represent the mathematics used in Energy Balance Climate Models) as an empirical example of this mathematical proof; I further prove that what the climatologists claim to be happening blatantly violates 2LoT (2nd Law of Thermodynamics) and Stefan’s Law, and is hence unphysical.
The climatologists have misused the Stefan-Boltzmann (S-B) equation (and the fundamental physical laws), and in the process have practically flipped reality on its head. Polyatomics (CO2, H2O, etc.) are not “global warming gases”; they are net atmospheric radiative coolants (radiative emission to space being the only way that Earth can shed energy). Monoatomics (Ar) are not inert gases with no effect upon climate; they are the actual “greenhouse gases”, because they cannot emit IR and thus cannot shed energy to space… they dilute the radiative coolant gases. Homonuclear diatomics (N2, O2) are somewhere in between: they can radiatively emit IR (and thus shed energy from the system known as ‘Earth’), but only under certain conditions (collisional perturbation of their net-zero electric dipole, which is why homonuclear diatomic vibrational mode quantum states are meta-stable and relatively long-lived; collisions happen exponentially less frequently as altitude increases), and thus they are “greenhouse gases” like the monoatomics, just not to the same extent.
Think about it this way… we all know the air warms up during the daytime as the planet’s surface absorbs energy from the sun. Conduction of that energy when air contacts the planet’s surface is the major reason air warms up.
How does that ~99% of the atmosphere (N2, O2, Ar) cool down? It cannot effectively radiatively emit.
Convection moves energy around in the atmosphere, but it cannot shed energy to space. Conduction depends upon thermal contact with other matter and since space is essentially a vacuum, conduction cannot shed energy to space… this leaves only radiative emission. The only way our planet can shed energy is via radiative emission to space. Fully ~76.2% of all surface energy is removed via convection, advection and evaporation. The surface only radiatively emits ~23.8% of all surface energy to space. That ~76.2% must be emitted to space by the atmosphere.
Thus, common sense dictates that the thermal energy of the constituents of the atmosphere which cannot effectively radiatively emit (N2, O2, Ar) must be transferred to the so-called ‘greenhouse gases’ (CO2 being a lesser contributor below the tropopause and the largest contributor above the tropopause, water vapor being the main contributor below the tropopause) which can radiatively emit and thus shed that energy to space. Peer-reviewed studies corroborating this are referenced in the attached file.
So, far from being ‘greenhouse gases’ which ‘trap heat’ in the atmosphere, those polyatomic radiative gases actually shed energy from the atmosphere to space. They are net atmospheric radiative coolants.
In short, in an atmosphere sufficiently dense such that collisional energy transfer can significantly occur, all radiative molecules play the part of atmospheric coolants at and above the temperature at which the combined translational mode energy of two colliding molecules exceeds the lowest vibrational mode quantum state energy of the radiative molecule. Below this temperature, they act to warm the atmosphere via thermalization (the mechanism the climate alarmists claim happens all the time), but if that occurs below the tropopause, the net result is an increase of Convective Available Potential Energy, which increases convection, which is a net cooling process. It is a gradation… as temperature increases, so too does the population of vibrationally excited polyatomics. For CO2, that ‘transition temperature’ (the temperature at which the molecule transitions from being ‘net warmant’ to ‘net coolant’ and vice versa) is ~288 K.
The climatologists only told everyone half the story: thermalization by CO2 via vibrational mode to translational mode (v-t) energy transfer collisional processes. They didn’t tell us about the inverse… translational mode to vibrational mode (t-v) energy transfer collisional processes (with that energy then being radiatively emitted to space), which is a cooling process. That part didn’t fit their doomsaying narrative, so they left it out.
https://web.archive.org/web/20220521192232if_/https://i.imgur.com/CxVTcro.png
Now, on to how the climatologists were able to flip reality on its head…
Essentially, the climatologists are treating real-world graybody objects as though they are idealized blackbody objects… with emission to 0 K and emissivity of 1 (sometimes… other times they slap emissivity onto the idealized blackbody form of the S-B equation while still assuming emission to 0 K… which is still a misuse of the S-B equation, for graybody objects).
This essentially isolates each object into its own system so it cannot interact with other objects via the ambient EM field, which grossly inflates radiant exitance of all objects, necessitating that the climatologists carry these incorrect values through their calculation and cancel them on the back end (to get their equations to balance) by subtracting a wholly-fictive ‘cooler to warmer’ energy flow from the real (but far too high because it was calculated for emission to 0 K) ‘warmer to cooler’ energy flow.
That wholly-fictive ‘cooler to warmer’ energy flow is otherwise known as ‘backradiation’… a mathematical artifact due to that aforementioned misuse of the S-B equation.
As I show in the attached paper, the correct usage of the S-B equation is via subtracting cooler object energy density from warmer object energy density to arrive at the energy density gradient, which determines radiant exitance of the warmer object.
2LoT in the Clausius Statement sense states that system energy cannot spontaneously flow up an energy density gradient (remember that while 2LoT in the Clausius Statement sense only mentions temperature, temperature is a measure of energy density, equal to the fourth root of energy density divided by Stefan’s Constant, per Stefan’s Law), that it requires “some other change, connected therewith, occurring at the same time“… that “some other change” typically being external energy doing work upon that system energy to pump it up the energy density gradient (which is what occurs in, for example, AC units and refrigerators).
The “backradiation” claim by the climatologists implies that energy can spontaneously flow up an energy density gradient… just one of many blatant violations of the fundamental physical laws inherent in the CAGW narrative. As I show in the attached paper, this is directly analogous to claiming that water can spontaneously flow uphill (ie: up a pressure gradient).
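As a purely numerical illustration of the difference-form bookkeeping described above (energy density e = aT^4 with a = 4σ/c, and the net exchange taken from the difference of the two T^4 terms), here is a minimal Python sketch; the temperatures and the emissivity are arbitrary illustrative values, not figures taken from this comment.

# Difference-form sketch: energy densities a*T^4 and net radiant exchange
# computed from the difference of the two T^4 terms.
# Temperatures and emissivity are arbitrary illustrative values.
SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
C     = 2.99792458e8     # speed of light, m s^-1
A_RAD = 4.0 * SIGMA / C  # radiation energy-density constant, J m^-3 K^-4

def energy_density(T):
    # Radiation energy density a*T^4, J m^-3
    return A_RAD * T**4

def net_exitance(T_hot, T_cold, emissivity=1.0):
    # Net radiant exchange, W m^-2, from the T^4 difference
    return emissivity * SIGMA * (T_hot**4 - T_cold**4)

T_hot, T_cold = 288.0, 255.0           # illustrative temperatures, K
print(energy_density(T_hot))           # ~5.2e-6 J m^-3
print(energy_density(T_cold))          # ~3.2e-6 J m^-3
print(net_exitance(T_hot, T_cold))     # ~150 W m^-2 for this illustrative pair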
In other words, the entirety of the CAGW industry is built upon a foundation of mathematical fraudery, and we’re all being lied to. Given that the climatologists are purportedly highly educated, there’s no way they’d slip up on such an elementary issue… ergo, it must be intentional deception. The only other possible explanation is profound incompetence on the part of the climatologists.
This means that the offshoots of CAGW… social cost of carbon, net zero, carbon capture and sequestration, carbon credit trading, etc. are as equally useless as CAGW… because the proper interpretation of the fundamental physical laws and the proper application of the S-B equation shows that CO2 is a net atmospheric radiative coolant (two peer-reviewed empirical studies are referenced in the attached paper corroborating this), not a “global warming gas”.
This doesn’t just apply to CO2, however. It applies to all atmospheric polyatomic molecules (in fact, far from the ‘global warming gas’ claimed by the climatologists, water acts as a literal refrigerant (in the strict ‘refrigeration cycle’ sense) below the tropopause, as I show in the attached paper). So while the climate catastrophists are attempting to shift from CO2 to CH4 (methane) as their climate bogeyman, it won’t work because their narrative still relies upon that same misinterpretation of the fundamental physical laws and misuse of the S-B equation.
These concepts used to be common knowledge. Somewhere along the way, the concepts got skewed to fit a particular narrative. Eventually, the concepts described herein will be common knowledge again, whereupon CAGW and its offshoots will be dumped on the midden heap of bad scientific ideas.
If you’ve an impartial physicist available, have them review the attached paper… you’ll find everything I write hews to the fundamental physical laws, and uses bog-standard thermodynamics, radiative theory, cavity theory, dimensional analysis, electrical theory and quantum theory.
Then ask a climatologist… they’ll claim it’s junk science… except it was all taken directly from physics tomes, and the climatologist is just attempting to protect their gravy train. Then ask the climatologist how their latest “Earth Energy Balance” graphic (remember that the “Earth Energy Balance” graphic represents the mathematics used in their Energy Balance Climate Models) could possibly arrive at 398 W m-2 surface radiant exitance at their claimed 288 K average global temperature… that’s not even physically possible (you can do the calculation using the S-B equation to see this for yourself). Watch as they hem and haw in attempting to explain their junk science.
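The calculation the comment invites the reader to do is a one-liner; a minimal Python sketch, assuming an idealized blackbody (emissivity 1) at 288 K:

# Stefan-Boltzmann radiant exitance of an ideal blackbody at 288 K
SIGMA = 5.670374419e-8       # W m^-2 K^-4
T = 288.0                    # K
print(SIGMA * T**4)          # ~390.1 W m^-2, versus the 398 W m^-2 figure cited above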
It’s time to end the CAGW sham, and all of its offshoots.
Don’t think that you’re the only ones who are getting this information… it is going out to businesses across the US. A legal challenge to your ruling utilizing this information will strike it down, and in the process destroy all of CAGW and its offshoots.
––––––––––––––-*-
If anyone wants the full paper, it’s here:
https://ufile.io/gb1xn4lh
TMI
I’ve been trying to simplify it for a layperson audience, but it’s not easy… one must have at least a superficial grounding in thermodynamics, radiative theory, dimensional analysis and cavity theory; and (if one wishes to use the analogization of thermodynamics to electrical theory) electrical theory; and (if one wants to get down into the mechanics of why energy only spontaneously flows down an energy density gradient, never up it) quantum theory.
I have been monitoring the climate change fraud since before extreme weather events were blamed on anthropogenic climate change. There was a period of time when it seemed as if there was an increase in extreme events. However, during this same period there also appeared to be an increase in devastating earthquakes. A Russian scientist had a plausible explanation: he said that these events were not more common, but that better worldwide communication made it appear that they were. Another factor may have been statistical clustering. This term has more than one meaning, but I’m referring to the case where a coin is flipped many times and heads and tails come up about equal, and then there is a run of heads coming up eight times in a row. This can also happen with the climate.
The earth wasn’t and isn’t warming as predicted by the computer models, so the activists needed a new reason to implement “rapid, far-reaching and unprecedented changes in all aspects of society.” Claiming an increase in extreme weather was a welcome means to continue their agendas. Even though these events are not increasing, the activists and their media stooges are not going to let go of it.
The “power of suggestion” influences people’s beliefs and behaviour. Superstitions are an example. Twice a day, 11:11 will appear on digital clocks. Someone started a website that focused on the “phenomenon”.
People fell for it. Suggestion is used by therapists in a helpful way. Devious megalomaniacs are using the internet to push climate fear like a drum beat.
I note in the article: “The first part of this notion is unquestionably true: the planet is warming due to human activities, primarily carbon dioxide emissions.”
In the past week or so I’ve seen several articles about the earth’s albedo and how it is changing. Fewer clouds also result in rising temperatures. As more research is done, it might turn out that neither of these possibilities is correct.
But carbon dioxide in the atmosphere? It has been steadily increasing for the past 60+ years, while the instrumental ‘average global temperature’ has risen, dropped, or remained stable for extended periods.
What physical mechanism(s) is employed by human emissions of CO2 to cause this climate change that is so terrible and getting worse? I can only think of one, and that is its ability to increase energy in the climate system, which would be seen as increased temperature. Consider “On the existence of a tropical hot spot, and EPA’s CO2 endangerment findings” by Wallace and others. This extensive analysis finds no significant statistical evidence in 14 temperature records that CO2 is the cause of the slight recent warming in any of them. It also finds no evidence of the tropical hot spot found in every global circulation model. It seems to me that these findings preclude any attribution of extreme weather to human CO2 emissions. Mr. Allen needs to falsify Wallace et al. before continuing his attribution work.
They claim it comes about via thermalization: CO2 absorbs 14.98352 µm radiation, collides with another atmospheric atom or molecule, and in a vibrational mode-to-translational mode (v-t) energy transfer process increases the translational mode energy of that other atom or molecule; and given that temperature is a measure of the translational mode energy of the atomic or molecular constituents of a gas, that increases temperature.
What they’re not telling everyone is the other side of the story… that of (t-v) collisional processes, and (t&v-v) collisional processes.
For a TL;DR, go to the last paragraph.
In dealing with solely translational mode energy and neglecting vibrational mode and rotational mode energy for the moment, the Equipartition Theorem states that molecules in thermal equilibrium have the same average energy associated with each of three independent degrees of freedom, equal to:
3/2 kT per molecule, where k = Boltzmann’s Constant
3/2 RT per mole, where R = gas constant
Thus the Equipartition Theorem equation:
KE_avg = 3/2 kT
… serves well in the definition of kinetic energy (which we sense as temperature).
It does not do as well at defining the specific heat capacity of polyatomic gases, simply because it does not take into account the increase of internal molecular energy via vibrational mode and rotational mode excitation. Energy imparted to the molecule via either photon absorption or collisional energy exchange can excite those vibrational mode or rotational mode quantum states, increasing the total molecular energy E_tot, but not affecting temperature at all. Since we’re only looking at translational mode energy at the moment (and not specific heat capacity), and internal molecular energy is not accounted for in measuring temperature (which is a measure of translational mode energy only), this long-known and well-proven equation fits our purpose.
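For reference, plugging T = 288 K (the temperature used throughout the rest of this comment) into those expressions gives the following; a minimal Python sketch:

# Average translational kinetic energy per molecule and per mole at 288 K,
# from the Equipartition Theorem: (3/2) k T and (3/2) R T
k = 1.380649e-23        # Boltzmann constant, J K^-1
R = 8.314462618         # gas constant, J mol^-1 K^-1
T = 288.0               # K
print(1.5 * k * T)      # ~5.96e-21 J per molecule
print(1.5 * R * T)      # ~3.59e3 J per mole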
Our thermometers register an instantaneous average of molecular kinetic energy. If they could respond fast enough to register every single molecule impinging upon the thermometer probe, we’d see the temperature jumping wildly up and down, with the spread given by the Maxwell-Boltzmann Speed Distribution Function. In other words, at any given measured temperature, some molecules will be moving faster (higher energy) and some slower (lower energy), following that equilibrium distribution curve.
The Equipartition Theorem states that in Local Thermodynamic Equilibrium conditions all molecules, regardless of molecular mass, will have the same kinetic energy and therefore the same temperature. For higher atomic mass molecules, they’ll be moving slower; for lower atomic mass molecules, they’ll be moving faster; but their kinetic (translational mode) energy will all be the same at the same temperature.
For CO2, with a molecular mass of 44.0095 amu, at 288 K the molecule will have:
Most Probable Speed {(2kT/m)^1/2} = 329.8802984961799 m/s
Mean Speed {(8kT/pm)^1/2} = 372.23005645833854 m/s
Effective (rms) Speed {(3kT/m)^1/2} = 404.0195258297897 m/s
For N2, with a molecular mass of 28.014 amu, at 288 K the molecule will have:
Most Probable Speed {(2kT/m)^1/2} = 413.46812435139907 m/s
Mean Speed {(8kT/pm)^1/2} = 466.5488177761755 m/s
Effective (rms) speed {(3kT/m)^1/2} = 506.3929647832758 m/s
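The characteristic speeds listed above follow directly from the quoted formulas; a short Python sketch reproducing them, with the molecular masses and temperature as given:

import math

k = 1.380649e-23           # Boltzmann constant, J K^-1
AMU = 1.66053906660e-27    # atomic mass unit, kg
T = 288.0                  # K

def speeds(mass_amu):
    # Most probable, mean, and rms speeds from the Maxwell-Boltzmann distribution
    m = mass_amu * AMU
    v_p   = math.sqrt(2 * k * T / m)
    v_avg = math.sqrt(8 * k * T / (math.pi * m))
    v_rms = math.sqrt(3 * k * T / m)
    return v_p, v_avg, v_rms

print(speeds(44.0095))   # CO2: ~329.9, ~372.2, ~404.0 m/s
print(speeds(28.014))    # N2:  ~413.5, ~466.5, ~506.4 m/s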
But if those molecules are at the exact same (kinetic) temperature, they’ll have exactly the same translational mode (kinetic) energy.
This energy at exactly 288 K is equivalent to the energy of a 33.3050 µm photon.
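That equivalence can be checked by setting the mean translational energy (3/2)kT equal to a photon energy hc/λ and solving for λ; a minimal Python sketch:

# Wavelength of a photon carrying the mean translational energy (3/2)kT at 288 K
h = 6.62607015e-34       # Planck constant, J s
c = 2.99792458e8         # speed of light, m s^-1
k = 1.380649e-23         # Boltzmann constant, J K^-1
T = 288.0                # K

E_single = 1.5 * k * T               # one molecule's average translational energy, J
print(h * c / E_single * 1e6)        # ~33.305 micrometres, matching the figure above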
If two molecules collide, their translational energy is cumulative, dependent upon angle of collision. In mathematically describing the kinematics of a binary molecular collision, one can consider the relative motion of the molecules in a spatially-fixed 6N-dimensional phase space frame of reference (lab frame) which consists of 3N spatial components and 3N velocity components, to avoid the vagaries of interpreting energy transfer considered from other reference frames.
Simplistically, for a head-on collision between only two molecules, this is described by the equation:
KE = (1/2 mv^2) [molecule 1] + (1/2 mv^2) [molecule 2]
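As a worked instance of that equation: two CO2 molecules, each moving at the rms speed listed earlier (so each carries (3/2)kT), meet head-on with a combined energy of 3kT, which corresponds to the roughly 16.65 µm photon figure used a few lines below. A Python sketch under that reading:

h = 6.62607015e-34       # Planck constant, J s
c = 2.99792458e8         # speed of light, m s^-1
AMU = 1.66053906660e-27  # atomic mass unit, kg

m_co2 = 44.0095 * AMU                   # kg
v_rms = 404.0195258297897               # m/s, rms speed of CO2 at 288 K (from the list above)

KE_total = 2 * 0.5 * m_co2 * v_rms**2   # combined energy of the head-on pair, J
print(h * c / KE_total * 1e6)           # ~16.65 micrometres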
The Maxwell-Boltzmann Speed Distribution Function, taking into account 3N spatial components and 3N velocity components:
https://i.imgur.com/0ZVflnN.png
You may surmise, “But at 288 K, the combined kinetic energy of two molecules in a head-on collision isn’t sufficient to excite CO2’s lowest vibrational mode quantum state! It requires the energy equivalent to a 14.98352 µm photon to vibrationally excite CO2, and the combined translational mode energy of two molecules colliding head-on at 288 K is only equivalent to the energy of a 16.6525 µm photon!”
True, but you’ve not taken into account some mitigating factors…
1) We’re not talking about just translational mode energy, we’re talking about E_tot, the total molecular energy, including translational mode, rotational mode, vibrational mode and electronic mode. At 288 K, nearly all CO2 molecules will be excited in the rotational mode quantum state, increasing CO2’s E_tot. The higher a molecule’s E_tot, the less total energy necessary to excite any of its other modes.
2) Further, the Boltzmann Factor shows that at 288 K, ~10.26671% of N2 molecules are in the N2{v1(1)} vibrational mode quantum state. That vibrational mode energy can be transferred along with kinetic mode energy to excite CO2 to its CO2{v20(2)} vibrational mode quantum state during collision.
N2{v1(1)} (stretch) mode at 2345 cm-1 (4.26439 µm), correcting for anharmonicity, centrifugal distortion and vibro-rotational interaction
1 cm-1 = 11.9624 J mol-1
2345 cm-1 = 2345 * 11.9624 / 1000 = 28.051828 kJ mol-1
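Those conversion factors follow from E = N_A · h · c · ν (energy per mole for wavenumber ν); a short Python sketch reproducing the per-cm-1 factor, the 28.05 kJ/mol figure, and the 4.264 µm wavelength quoted above:

# Convert a vibrational wavenumber to molar energy and to photon wavelength
h  = 6.62607015e-34       # Planck constant, J s
c  = 2.99792458e10        # speed of light in cm s^-1 (wavenumbers are per cm)
NA = 6.02214076e23        # Avogadro constant, mol^-1

per_wavenumber = h * c * NA            # J mol^-1 per cm^-1
print(per_wavenumber)                  # ~11.963 J mol^-1 per cm^-1

nu = 2345.0                            # cm^-1, the N2 stretch figure quoted above
print(per_wavenumber * nu / 1000)      # ~28.05 kJ mol^-1
print(1e4 / nu)                        # ~4.264 micrometres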
The Boltzmann factor at 288 K has the value 1 / (2805.1828 / 288R) = 0.10266710 which means that 10.26671% of N2 molecules are in the N2{v1(1)} vibrational mode quantum state.
Given that CO2 constitutes 0.041% of the atmosphere (410 ppm), and N2 constitutes 78.08% of the atmosphere (780800 ppm), this means that 80162.3936 ppm of N2 is vibrationally excited via t-v (translational-vibrational) processes at 288 K. You’ll note this equates to 195 times more vibrationally excited N2 molecules than all CO2 molecules (vibrationally excited or not).
So during a collision, some vibrational mode energy from N2{v1(1)}, and some kinetic energy from the two colliding molecules, will transfer energy to the vibrational mode quantum state of CO2, if that energy is sufficient to excite that particular vibrational mode.
Thus energy will flow from the higher-energy (and higher concentration) N2{v1(1)} molecules to CO2 molecules excited to their CO2{v20(2)} vibrational mode quantum state (and the CO2 molecule got to that vibrational mode quantum state from the CO2{v20(0)} ground state by a prior collision), exciting the CO2 to its {v3(1)} vibrational mode, whereupon it can drop to its {v1(1)} or {v20(2)} vibrational modes by emission of 9.4 µm or 10.4 µm radiation (wavelength dependent upon isotopic composition of the CO2 molecules).
What I’ve described above is what occurs in CO2 lasers… the same thing happens in the atmosphere. The only difference is how the N2 got vibrationally excited (in a lasing tube by collision with an electron, in the atmosphere by collision with solar insolation-excited O3). The energy transfer from N2 to CO2 is the same in either case.
The energy flow from translational modes of molecules to N2 vibrational mode quantum states, then to CO2 vibrational mode quantum states, then to radiation constitutes a cooling process.
––––––––––––––-*-
The Maxwell-Boltzmann Speed Distribution Function gives a wide translational mode equilibrium distribution. In order for CO2 to be vibrationally excited, it requires a minimum of the energy equivalent to a 14.98352 µm photon, equating to a CO2 speed of 425.92936688660114 m/s or an N2 speed of 533.8549080851558 m/s.
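Those threshold speeds can be reproduced on the reading that a symmetric head-on collision between two identical molecules supplies the full excitation energy hc/λ, i.e. each partner contributes (1/2)mv^2, giving v = sqrt(E_photon/m). A Python sketch under that assumption:

import math

h = 6.62607015e-34        # Planck constant, J s
c = 2.99792458e8          # speed of light, m s^-1
AMU = 1.66053906660e-27   # atomic mass unit, kg

E_exc = h * c / 14.98352e-6           # energy of a 14.98352 um photon, J

def threshold_speed(mass_amu):
    # Speed at which each of two identical head-on partners supplies E_exc / 2
    m = mass_amu * AMU
    return math.sqrt(E_exc / m)       # from 2 * (1/2) m v^2 = E_exc

print(threshold_speed(44.0095))       # ~425.9 m/s for CO2
print(threshold_speed(28.014))        # ~533.9 m/s for N2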
Remember I wrote above:
For CO2, with a molecular mass of 44.0095 amu, at 288 K the molecule will have:
Most Probable Speed {(2kT/m)^1/2} = 329.8802984961799 m/s
Mean Speed {(8kT/pm)^1/2} = 372.23005645833854 m/s
Effective (rms) Speed {(3kT/m)^1/2} = 404.0195258297897 m/s
For N2, with a molecular mass of 28.014 amu, at 288 K the molecule will have:
Most Probable Speed {(2kT/m)^1/2} = 413.46812435139907 m/s
Mean Speed {(8kT/pm)^1/2} = 466.5488177761755 m/s
Effective (rms) speed {(3kT/m)^1/2} = 506.3929647832758 m/s
For CO2, the Boltzmann Factor probability of one of its molecules being at a speed of 425.92936688660114 m/s; and for N2, the Boltzmann Factor probability of one of its molecules being at a speed of 533.8549080851558 m/s is 0.8461 at 288 K. In other words, for every 100 molecules which are at the Most Probable Speed, another ~84 molecules will be at the speed necessary to vibrationally excite CO2.
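The comment does not spell out how the 0.8461 figure was obtained; one reading that lands in the same range is to compare the Maxwell-Boltzmann speed-distribution density at the threshold speed with its value at the most probable speed, which gives roughly 0.86. A Python sketch under that assumed reading:

import math

def mb_density_ratio(v, v_p):
    # Ratio f(v)/f(v_p) for the Maxwell-Boltzmann speed distribution,
    # where f(v) is proportional to v^2 * exp(-v^2 / v_p^2)
    x = (v / v_p) ** 2
    return x * math.exp(1.0 - x)

print(mb_density_ratio(425.93, 329.88))   # CO2 threshold speed vs most probable speed, ~0.86
print(mb_density_ratio(533.85, 413.47))   # N2 threshold speed vs most probable speed, ~0.86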
Thus at ~288 K and higher temperature, the combined translational mode energy of colliding atmospheric molecules begins to significantly vibrationally excite CO2, increasing the time duration during which CO2 is vibrationally excited and therefore the probability that the CO2 will radiatively emit. The conversion of translational mode to vibrational mode energy is, by definition, a cooling process. The emission of the resultant radiation to space is, by definition, a cooling process.
As CO2 concentration increases, the population of molecules able to become vibrationally excited increases, thus increasing the number of CO2 molecules able to radiatively emit, thus increasing photon flux, thus increasing energy emission to space.
As temperature increases, the population of vibrationally excited CO2 molecules increases, thus increasing the number of CO2 molecules able to radiatively emit, thus increasing photon flux, thus increasing energy emission to space.
This is why I state that CO2 becomes a net atmospheric coolant at approximately 288 K… the exact solution is near to impossible to calculate, given the nearly infinite number of angles of molecular collision, the equilibrium distribution of molecular speed, and the fact that atmospheric molecular composition varies spatially and temporally with altitude and water vapor concentration variations.
https://web.archive.org/web/20220521192232if_/https://i.imgur.com/CxVTcro.png
In an atmosphere sufficiently dense such that collisional energy transfer can significantly occur, all radiative molecules play the part of atmospheric coolants at and above the temperature at which the combined translational mode energy of two colliding molecules exceeds the lowest vibrational mode quantum state energy of the radiative molecule. Below this temperature, they act to warm the atmosphere via the mechanism the climate alarmists claim happens all the time, but if that warming mechanism occurs below the tropopause, the net result is an increase of Convective Available Potential Energy, which increases convection, which is a net cooling process.
––––––––––––––-*-
Think about it this way… we all know the air warms up during the daytime. Conduction of energy (that energy from the sun, absorbed by the planet’s surface) when air contacts the planet’s surface is the major reason air warms up.
Yet the major constituents of the atmosphere (N2 and O2) are homonuclear diatomics and thus cannot effectively radiate energy to cool down (unless their net-zero electric dipole is perturbed via collision). They constitute ~99% of the atmosphere. How does that 99% cool down?
Convection moves energy around in the atmosphere, but it cannot shed energy to space. Conduction depends upon thermal contact with other matter and since space is essentially a vacuum, conduction cannot shed energy to space… this leaves only radiative emission. The only way our planet can shed energy is via radiative emission to space.
But N2 and O2 cannot effectively radiatively emit because, being homonuclear diatomic molecules, they have no net electric dipole. Only when the molecule collides with something else (perturbing its net-zero electric dipole) at the same time as a photon is incident upon it does it have any chance of absorbing radiation, and even then it doesn’t happen every single time. The same goes for radiative emission: it requires a collision that perturbs the molecule’s net-zero electric dipole. That’s why homonuclear diatomic vibrational mode quantum states are meta-stable and relatively long-lived.
Thus, common sense dictates that the thermal energy of the 99% of the atmosphere which cannot effectively radiatively emit must be transferred to the so-called ‘greenhouse gases’ (CO2 being a lesser contributor in the lower atmosphere and the largest contributor in the upper atmosphere, water vapor being the main contributor in the lower atmosphere) which can radiatively emit and thus shed that energy to space.
So, far from being ‘greenhouse gases’ which ‘trap heat’ in the atmosphere, those radiative gases actually shed energy from the atmosphere to space. They are net atmospheric radiative coolants.
The IPCC Working Group isn’t “playing it straight” as regards extreme event detection and attribution… they’re biding their time until they become so entrenched that no one dare challenge them. Then you’ll see constant attribution of every bad event (weather or not) to CAGW, and enough catastrophism to choke an elephant.
Strange that the climate catastrophists claim we skeptics are engaging in “alternative facts” when we’ve got the fundamental physical laws and the proper usage of the Stefan-Boltzmann equation on our side, whereas the entirety of CAGW and all of its offshoots (climate attribution, net zero, carbon capture and sequestration, carbon credit trading, etc.) are predicated upon a provable misuse of the S-B equation and a misinterpretation of the fundamental physical laws.
https://climatechangedispatch.com/madness-continues-sec-approves-climate-rule-forcing-companies-to-disclose-emissions/#comment-66062
If anyone wants the full paper, it’s here:
https://ufile.io/gb1xn4lh
But the tide is turning… lawmakers are receiving the paper, judges are being educated, and energy company legal departments are learning how to completely quash climate lawfare lawsuits. All it takes is one company with one properly-trained physicist who knows this stuff intuitively to get the scientific facts on record in a court of law during one of these lawfare suits; that sets a precedent which will flow out to every other similar lawfare suit and make getting them dismissed for being predicated upon fantasy fyziks a trivial matter.