An easy-to-understand, fundamental climate sensitivity model is designed to empower those of us who are not schooled in the geo- or climate sciences. The fact that most, if not all, published models are riddled with unfamiliar words, phrases, and, well…riddles, should be, and now is, brought into question.
Who “invented” the term ‘global warming’ might be one of the first questions. Perhaps a good second question is: what is it based on? And the best question of all should be, “can you prove it?” This is where politics meets the road and science is shoved aside.
You might recognize the following statement: “If you can’t convince them…confuse them” (Harry S. Truman). This statement alone captures the basic foundation of a wartime strategy: winning the “hearts and minds” of the enemy populace. In the case of global warming, it would mean winning over the less-informed – hence the use of computer-generated models filled with speculative projections at best, or intentionally manipulated, deceptive formulas at worst.
In 1990, the First Assessment Report of the Intergovernmental Panel on Climate Change (IPCC) expressed “substantial confidence” in near-term warming projections that turned out to run at roughly twice the rate later observed. When you compare the tangible facts of “observed” versus “computer-generated” projections, the warming trend since then has occurred at about half the rate the IPCC projected.
Despite rising CO2 concentration, few if any models predicted the near-absence of warming since 2001. Between the pre-final and published drafts of the IPCC Fifth Assessment Report, the near-term warming projection was cut substantially. Reducing the transient climate-sensitivity parameter from 1.9 to 1.5 W m−2 K−1 would correspondingly reduce the central climate-sensitivity estimate from 3.2 to 2.2 K. This would suggest there is little unrealized warming in the pipeline – less than 1 K this century – and that combustion of all recoverable fossil fuels would cause less than 2.2 K of global warming at equilibrium.
Resolving the discrepancies between the IPCC’s Fourth and Fifth Assessment Reports is crucial for exposing the overt (some would say covert) implications of desired outcomes. Once those discrepancies are taken into account, the impact of anthropogenic global warming over the next century and beyond would come out at less than one-third to one-half of the IPCC’s current projections.
The Earth’s greenhouse effect does not require complex computer models in order to calculate useful numbers for debating the issue. However, let us not forget: “if you can’t convince them…confuse them.” Dressing up estimated conjecture by touting highly complicated computer calculations, and claiming that the computer’s algorithmic accuracy should go without question, ought to raise a very visible red flag.
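To illustrate the kind of back-of-the-envelope number the paragraph has in mind, here is a minimal sketch. The logarithmic forcing formula ΔF = 5.35·ln(C/C0) W/m² is a standard simplified expression from the literature; the sensitivity parameter `lam` and the concentrations used are illustrative assumptions, not figures taken from this article – indeed, the value of that parameter is precisely what the debate is about.

```python
import math

def co2_forcing(c_now, c_ref=280.0):
    """Radiative forcing (W/m^2) from raising CO2 from c_ref to c_now ppm.

    Uses the standard simplified logarithmic formula dF = 5.35 * ln(C/C0).
    """
    return 5.35 * math.log(c_now / c_ref)

def warming(c_now, lam, c_ref=280.0):
    """Temperature response (K) for an assumed sensitivity parameter lam (K per W/m^2)."""
    return lam * co2_forcing(c_now, c_ref)

# Doubling CO2 (280 -> 560 ppm) gives about 3.7 W/m^2 of forcing,
# independent of what sensitivity parameter one assumes:
f2x = co2_forcing(560.0)

# Two illustrative (assumed) sensitivity parameters bracket the dispute:
low = warming(560.0, lam=0.3)   # a low-sensitivity assumption
high = warming(560.0, lam=0.9)  # a high-sensitivity assumption

print(f"forcing at doubling: {f2x:.2f} W/m^2")
print(f"warming per doubling: {low:.1f} K (low) to {high:.1f} K (high)")
```

The point of the sketch is simply that the headline numbers in the debate fall out of one multiplication once a sensitivity parameter is chosen – which is why the choice of that parameter, not the arithmetic, is where the disagreement lives.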
Those of you who have been playing with computers since sometime in the early 90s might remember the acronym GIGO (Garbage In, Garbage Out). Simply put, you can write script for any desired outcome you wish. Yet this is the yardstick by which everything is to be measured. Heaven forbid one should question such methods. “The science is in, there is no debate. You are either with me – or you are”…a dirt-bag, uneducated, polluting sinner. Okay, the words following “or you are” belong to me, but the rest belongs to Al Gore (not an evil man, just wrong). But you can see how such a statement underscores the technique of “winning the hearts and minds.”