This model addresses the question: “how is the total Greenhouse effect modified by variations in the concentrations of atmospheric Greenhouse gases?”
The model attempts to quantify how much the total Greenhouse effect has been augmented by the higher levels of various Greenhouse gases in the atmosphere since pre-industrial times.
The approach is to treat the world as it is and then look for any relationship between changes in the total Greenhouse effect and other observed metrics.
These observed metrics include global atmospheric temperatures at different altitudes, average global concentrations of Greenhouse gases and world population. These are all 'good' numbers for which reliable and consistent monthly values are available.
They may be subject to minor measurement inaccuracies but they are the best available.
I have accumulated these data from various sources: pre-satellite temperatures from Berkeley Earth and post-satellite temperatures from https://www.drroyspencer.com/latest-global-temperatures/
Levels of Greenhouse gases are principally from the NOAA facility at Mauna Loa (https://gml.noaa.gov/ccgg/trends/), with additional historic data from a variety of sources.
Some data have been taken from the IPCC AR6 report, their latest: https://www.ipcc.ch/assessment-report/ar6/
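For readers who want to assemble a similar dataset, a minimal sketch of pulling the NOAA Mauna Loa monthly CO₂ series is shown below. The file name and column layout are as published at the time of writing and should be verified against the NOAA page above; this is an illustration, not the exact pipeline used for the model.

```python
# Minimal sketch: fetch the NOAA Mauna Loa monthly-mean CO2 series (ppm).
# File name and column layout are as published at the time of writing;
# verify against https://gml.noaa.gov/ccgg/trends/ before relying on them.
import pandas as pd

CO2_URL = "https://gml.noaa.gov/webdata/ccgg/trends/co2/co2_mm_mlo.csv"

# The file opens with '#'-prefixed comment lines describing the columns,
# followed by a header row (year, month, average, deseasonalized, ...).
co2 = pd.read_csv(CO2_URL, comment="#")
co2.columns = [c.strip() for c in co2.columns]
print(co2.tail())
```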
No prior causal relationship has been assumed.
The models of the Anointed (see elsewhere on this website for use of the term ‘Anointed’ after Thomas Sowell) all start out by incorporating, axiomatically, the causal relationship ‘Anthropogenic emissions cause Global Warming’.
This leads to the logic trap of circular reasoning. Circular reasoning, or ‘confirmatory modelling’, starts out at the finishing line by assuming the proposed causal step is true. The modelling then proceeds to prove that the starting assertion is valid - which is the only outcome the modelling can validate.
In mathematicians’ language, any model that axiomatically contains an assertion can only verify that the assertion is true. A bit like pulling yourself up by your own bootlaces (in the original meaning).
This is classic circular reasoning, and it is so logically flawed that it is incredible to me that scientists accept that these models have any validity.
The genesis of the current 'Climate Change' modelling is rooted in a coincidence.
It was observed that there was a spell of hot weather - particularly the hot Northern hemisphere summer of 1976. At the same time it was noted that carbon dioxide levels had increased.
This led some people (notably Hansen & Bolin - see What is Climate Change?) to postulate 'it's obviously carbon dioxide causing this hot weather'. They went on to make outrageous predictions to instil fear into a gullible audience. These predictions all turned out to be wrong.
The rest of the sad tale of 'Climate Change', as they say, is history (see The rise of Climate Change Dogma).
And it is all based on a groundless, coincidental postulate. This postulate rests on the anthropocentric idea that humans and our industrialising nations are to blame.
This leads to accusations of guilt, as though these industrialising nations deliberately caused a 'Climate emergency' and must, therefore, be punished.
The Cardinal Model, by contrast, tries to describe how increased levels of Greenhouse gases and many other factors might be associated with the increase in the total Greenhouse effect – as observed.
The Greenhouse Effect introduces some of the concepts that are modelled using empirical parameters in the Cardinal Model.
Included in the Cardinal Model, for example, is the globally observed (monthly) average carbon dioxide level, along with the observed ground-level concentrations of other atmospheric gases. These concentrations are linked to the total Greenhouse effect through a series of empirical parameters.
Empirical parameter modelling of observations is the process of building a mathematical model based on observed data, rather than on a priori physical or theoretical principles. The parameters in such models are quantifiable measures derived from observations and are used to fit the model to the data and characterise how the ‘system under consideration’ works.
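In general form this can be written as follows (my notation, for illustration only - not necessarily that of the Cardinal Model):

```latex
% Generic form of empirical parameter modelling (illustrative notation).
% y_t     : observed total Greenhouse effect in month t
% x_t     : observed inputs (gas concentrations, temperatures, ...)
% theta   : vector of empirical parameters
\[
  y_t = f(\mathbf{x}_t;\,\boldsymbol{\theta}) + \varepsilon_t,
  \qquad
  \hat{\boldsymbol{\theta}}
  = \arg\min_{\boldsymbol{\theta}}
    \sum_t \bigl( y_t - f(\mathbf{x}_t;\,\boldsymbol{\theta}) \bigr)^2 .
\]
```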
The parameters link to concepts like effective density, Planck radiation, photon flux density, effective emission altitude, saturation zone, broadening/overlap zone, atmospheric layering, and Arrhenius-form and other responses, some of which are derived from consideration of Schwarzschild’s Radiative Transfer Equation.
These are all described in The Greenhouse Effect - Technical References and Backup.
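For reference, the standard textbook forms of two of these ingredients are sketched below; the symbols are the conventional ones and not necessarily the Cardinal Model's own notation or parameterisation, which is set out in the technical reference above.

```latex
% Textbook forms only; see the technical reference for the model's versions.
% Schwarzschild's Radiative Transfer Equation for intensity I_nu along a
% path s, with absorption coefficient kappa_nu, density rho, and Planck
% source function B_nu(T) (local thermodynamic equilibrium, no scattering):
\[
  \frac{dI_\nu}{ds} = \kappa_\nu \rho \,\bigl( B_\nu(T) - I_\nu \bigr)
\]
% Arrhenius-form (logarithmic) response of forcing to a gas concentration C,
% relative to a reference concentration C_0, with empirical coefficient alpha:
\[
  \Delta F = \alpha \ln\!\left( \frac{C}{C_0} \right)
\]
```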
The model proceeds to find values for all these empirical parameters by a process of multivariate analysis and variance minimisation – trying to fit the model’s outcome to the real world – as observed.
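As an illustration of this fitting step, a minimal sketch using scipy's least-squares optimiser is shown below. The two-parameter toy model function and the data arrays are placeholders of my own - the real Cardinal Model has 35 parameters and a far richer functional form.

```python
# Minimal sketch of multivariate variance minimisation, in the spirit of
# the fitting step described above. The toy model and data arrays are
# placeholders, NOT the actual Cardinal Model.
import numpy as np
from scipy.optimize import least_squares

def model(theta, co2, ch4):
    """Toy forcing model: logarithmic responses to CO2 (ppm) and CH4 (ppb)."""
    a_co2, a_ch4 = theta
    return a_co2 * np.log(co2 / 280.0) + a_ch4 * np.log(ch4 / 700.0)

def residuals(theta, co2, ch4, observed):
    """Differences between modelled and observed total Greenhouse effect."""
    return model(theta, co2, ch4) - observed

# Placeholder monthly observations (replace with the real series).
co2 = np.array([315.0, 340.0, 370.0, 400.0, 420.0])
ch4 = np.array([1250.0, 1550.0, 1750.0, 1800.0, 1900.0])
observed = np.array([0.05, 0.10, 0.15, 0.20, 0.23])

fit = least_squares(residuals, x0=[1.0, 0.1], args=(co2, ch4, observed))
print("fitted parameters:", fit.x)
print("monthly RMS error:", np.sqrt(np.mean(fit.fun**2)))
```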
The Cardinal Model incorporates 35 parameters, all of which have been found to be non-trivial.
The ‘goodness of fit’ of the model with its derived empirical parameters to the real world is nearly perfect.
Because of this near-perfect fit, I have a very high confidence that the values obtained for the empirical parameters are a good and valid representation of reality.
It is inevitable that the data used in the model contain slight measurement errors (particularly for inhomogeneously distributed gases), but these errors have only a negligible effect on the empirical parameter values, which makes the 'near perfect' fit reassuring.
Attempts to increase the number of empirical parameters in order to provide a better fit yielded no non-trivial additions.
The fit to the observed total Greenhouse effect is so close (monthly error of 0.003 W m⁻² or less) that there is little room for increased accuracy.
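One way to check the claim that measurement error barely moves the fitted parameters is a simple perturbation test: re-fit repeatedly with noise added to the inputs and examine the spread of the recovered parameter values. A sketch, reusing the toy model and data from the previous example (the noise magnitudes are my own illustrative choices):

```python
# Sketch of a perturbation (Monte Carlo) test of parameter stability:
# add noise of the order of the measurement uncertainty to the inputs,
# re-fit, and look at the spread of the recovered parameters.
# Reuses model/residuals/co2/ch4/observed from the previous sketch.
rng = np.random.default_rng(0)
fits = []
for _ in range(200):
    co2_noisy = co2 + rng.normal(0.0, 0.2, co2.shape)   # ~0.2 ppm noise
    ch4_noisy = ch4 + rng.normal(0.0, 2.0, ch4.shape)   # ~2 ppb noise
    f = least_squares(residuals, x0=[1.0, 0.1],
                      args=(co2_noisy, ch4_noisy, observed))
    fits.append(f.x)

spread = np.std(np.array(fits), axis=0)
print("parameter standard deviations under input noise:", spread)
```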
Some specific results from Step 3 The Outcomes - Summary are:
Carbon Dioxide ‘doubling’: an extra 0.366 W m⁻² since pre-industrial times
Methane ‘tripling’: 0.094 W m⁻² since pre-industrial times
Other Greenhouse gas ‘doublings’: <0.311 W m⁻² since pre-industrial times
Total ‘doubling’/‘tripling’: <0.771 W m⁻² since pre-industrial times
Total from pre-industrial times to 2025: <0.653 W m⁻²
Total from 2025 to ‘doubling’/‘tripling’: ~0.118 W m⁻²
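Assuming these totals are simple sums of the per-gas contributions (my reading of the figures, not stated explicitly above), the quoted numbers combine consistently:

```latex
% Consistency check on the quoted figures (assuming simple addition):
\[
  0.366 + 0.094 + 0.311 = 0.771 \ \mathrm{W\,m^{-2}},
  \qquad
  0.771 - 0.653 = 0.118 \ \mathrm{W\,m^{-2}} .
\]
```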