GLOBAL CLIMATE CHANGE: Statistical Expectations and Humanistic Perspective (Revision 1)

Fundamental Limitations in IPCC Global Warming Report

Abstract

Notwithstanding a prodigious international effort to assess global climate change, I identify three fundamental limitations in the IPCC global-warming report: (1) Its validity is tied directly to data normalized to the year 1750, which allows for different perspectives arising from other referents, (2) dependence on data-normalization is understated in the report, and (3) day-to-day humanistic considerations and environmental stresses might be more acute and chronic than global climate change.

While written a year or two before the CRU ClimateGate scandal broke out, this Knol has advised caution against placing too much reliance on the reported IPCC data. This version is effectively a 2010 new-year update.

Hyperlinks have been added to facilitate navigation back and forth throughout this Knol, revised for 1 January 2010.

Global Climate Change:

Statistical Expectations and Humanistic Perspective 

Table of Contents (Hyperlinked)

GENERALITIES ABOUT IPCC LIMITATIONS

      Normalization

      Limitations of Data-Normalization

      Radiative-Forcing Inferences

      Daubert Standard

      Limitations Not Fully and Candidly Disclosed

SCIENTIFIC CONCERNS ABOUT IPCC DATA NORMALIZATION

CONCLUSIONS ABOUT THE GLOBAL-WARMING TREND

HUMANISTIC BALANCE

ACKNOWLEDGMENTS

NOTES

 

Global climate is a “hot button” issue, a crise par jour (“a crisis a day”) that inspires and justifies ever more opinions.  So here’s another evaluation, not from a declared “advocate” or a “denier,” but from an experienced physicist, a conditioned “skeptic.”  Although I have not personally carried out research in this area, I have had considerable experience with technical-data treatment and statistical estimation.

      After extensive inquiry and discussion, and with great respect and admiration for the periodic multi-disciplinary reviews of the Intergovernmental Panel on Climate Change (IPCC),[1] presented here are some observations and reservations distilled from technical readings, experimental background, and collegial conversations.  While this analysis propels me upstream against increasingly conventionalized wisdom, many esteemed colleagues have been there before.  None of my comments should be taken to favor or criticize any other aspect or merit of the comprehensive IPCC reports or findings.

      The IPCC’s most-recent assessment of ambient conditions has made a huge and justifiable impression with policymakers and the public about climate and environmental awareness.

      Nevertheless, here’s the “but”: I find three contextual shortcomings in the IPCC climate-change trend assessment and its public acceptance:

> Validity that is implied/inferred/extended beyond inherent data-normalization limitations.

> Understatement of their dependence on data-normalization.

> Insufficient humanistic balance.

      On the first point, the limitations of IPCC normalization need to be explicitly framed and better understood in terms of quantitative-measurement scientific methodology.

      As for IPCC understatement: the report’s explicit acknowledgment of its dependence on, and the limitations of, derived climate-change estimation is surprisingly inconspicuous.

      Thirdly, humanistic considerations are an unavoidable externality, irrespective of technical issues.  What resources should be allocated at the expense of other pressing needs?

      These specific limitations of the IPCC-interpreted global-warming trend are discussed below, initially as “Generalities,” followed by supporting “Specifics” about data-normalization.  None of these deficiencies invalidate the IPCC assessment, but they do provide grounds for caution in formulating policies and allocating resources based on the assessment.

      This Knol was slightly revised at the beginning of the new year 2010, with hyperlinks inserted to assist navigation back and forth through the document.

Return to Table of Contents 

  

      GENERALITIES ABOUT IPCC LIMITATIONS

I don’t dispute 20th Century manifestations of globally-integrated warming — that the averaged world temperature has been increasing in modern times; rather, I question whether anthropogenically driven radiative-forcing (RF) factors are accurately implicated as causative.

Normalization.  As applied in physics and stochastics, “normalization” refers to a well-established, commonly used, and valid means of extracting statistically meaningful trends from complex datasets.  While the normalization process improves mathematical precision (repeatability) for the trend, its systematic dependency (accuracy) is tied entirely to the referent: data collected at or about “1750,” as variously declared in the IPCC compendium.
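      To make the precision-versus-accuracy distinction concrete, here is a minimal numerical sketch — my own illustration, with an invented flux level, calibration offset, and noise scale, none of which come from IPCC data.  It shows how differencing two epochs against a referent cancels a shared systematic offset (good precision) while leaving every normalized conclusion hostage to the referent itself (limited accuracy).

```python
# Illustrative sketch only: differencing against a referent improves precision
# but ties accuracy to the referent.  All numbers are invented.
import numpy as np

rng = np.random.default_rng(0)

true_level_1750 = 238.0                  # hypothetical absolute flux, W m-2
true_level_2005 = true_level_1750 + 1.6  # the small change to be detected

calibration_offset = 5.0   # unknown systematic error common to both epochs
scatter = 0.3              # random measurement noise, W m-2

obs_1750 = true_level_1750 + calibration_offset + rng.normal(0, scatter, 1000)
obs_2005 = true_level_2005 + calibration_offset + rng.normal(0, scatter, 1000)

# Absolute estimate: biased by the full +5 systematic offset (poor accuracy).
print("absolute 2005 estimate:", round(obs_2005.mean(), 2))

# Normalized (differenced) estimate: the shared offset cancels, so the trend
# is precise -- but it is now defined only relative to the 1750 referent.
print("change since 1750:", round(obs_2005.mean() - obs_1750.mean(), 2))
```

      Any error in the referent propagates one-for-one into every conclusion drawn from the normalized value; that is the accuracy limitation at issue throughout this Knol.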

Limitations of Data-Normalization.  Contrary to oft-publicized impressions drawn from cumulative global-warming reports, IPCC quantitative projections of the greenhouse effect are, with due deference, neither absolute nor incontestable.  The key radiative-forcing parameter derived from the IPCC climatological trend analysis is trammeled — that is, analytically tied — to the nominal year 1750; by choosing this specific date to orient its assessment, the analysis acquires inherent limitations that effectively underrate alternative explanations for global-climate reckoning.

      In carrying out an explicit diagnosis of IPCC methodology, I have gleaned both strengths and boundaries of the immense compilation, as well as a partial understanding of why it has drawn criticism as well as praise.

      Although the IPCC evaluation represents an assiduous contemporary analysis of global-climate changes, methodological limitations (not necessarily inaccuracies) account for some controversy surrounding the international assessment and its interpretation.  The many thousands of pages in several volumes and supporting reports that contain complex terminology make it difficult for nonprofessionals, as well as for professionals of relevant disciplines, to carry out incisive analysis.

      Moreover, formalistic, emotional, and institutional barriers exist that inhibit the publication of critiques in traditional media.

      In order to extract a very small effect from a large body of variables, the IPCC had to “normalize” its current results by comparing the data with a selected point in the past.  For lack of a better choice, “the beginning of the industrial era [1750]” was chosen by the IPCC for referencing its RF data, because that is about the time when instrumented measurements of climate-related parameters became available.

Radiative-Forcing Inferences.  The IPCC is not yet at the point where it can determine the current radiative-forcing value with high accuracy and precision from contemporary physical measurements and computational models; instead, it estimated an increase since 1750 of about 0.12% (that is, 1.6 W m-2, uncertain by roughly ±50% at a 90% confidence level) in radiative forcing.
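      For readers who want to check the percentage, here is my own back-of-envelope arithmetic (the irradiance figure is the satellite-era annual mean from note [10] below):

```python
# Back-of-envelope check of the ~0.12% fraction quoted above.
total_solar_irradiance = 1366.1   # W m-2, satellite-era annual mean (note [10])
rf_change_since_1750 = 1.6        # W m-2, IPCC combined anthropogenic estimate

print(f"{rf_change_since_1750 / total_solar_irradiance:.4%}")  # 0.1171%, i.e. ~0.12%

# The stated 90% confidence range (0.6 to 2.4 W m-2) is asymmetric about 1.6:
print((0.6 - 1.6) / 1.6, (2.4 - 1.6) / 1.6)   # -0.625 and +0.5, i.e. roughly ±50%
```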

      Had the IPCC chosen a much earlier reference point, say more than 7,000 years ago, the accuracy and precision assigned to the small change in calculated radiative-forcing would have been overwhelmed by uncertainties in estimating the radiative flux so far back in time.

      (Also, as it turns out, normalization back that far would indicate a long-term net decrease in global temperature in the intervening 7,000 years).

      The IPCC information-treatment process must be understood in context.  Its estimation of global warming attributable to anthropogenic radiative forcing is comparatively small in absolute terms: a fraction (about 0.12%) of the Sun’s global irradiance.  Therefore, in order for the IPCC to extract a sufficiently precise RF value that would be actionable for global-policymaking purposes, a normalization process had to be introduced so that longer-term uncertainties would not overwhelm the very small, but statistically decipherable, RF increase inferred for the past quarter millennium.

      By choosing normalization, a commonly accepted data-treatment mechanism that circumvents statistical limitations, IPCC specialists avoid having their results overwhelmed by dependencies and uncertainties in proxy values of solar irradiance.  Because of much larger incertitude in available estimates of paleoclimatic radiative forcing, current data was normalized to the far more proximate year-1750 point-of-reference.

      Aside from the previously mentioned questions about IPCC analytical methodology, there is an ill-timed happenstance in the adjustment to the beginning of the industrial era: global temperature was then comparatively low, which makes subsequent values appear markedly higher.

      IPCC global-climate trend conclusions are thus explicitly limited by the choice, quality, and normalization of selected and analyzed data (and by computational models that bridge gaps in knowledge).

Daubert Standard.  As a suitable standard for scientific methodology, I employ the U.S. Supreme Court “Daubert” decision,[2] along with the Court’s supporting documents, for data-treatment guidelines.  The 1993 decision, based on the Court’s assessment of modern scientific practice and logic, revised federal standards for forensic testimony.  For areas of science that require an explicit estimate of probabilistic error, the “Daubert” ruling was that quantifiable evidence should meet four “scientific method” standards: peer review, replicability, documentation, and stated rates of error.

Limitations Not Fully and Candidly Disclosed.  Close examination of the IPCC global-climate evaluation reveals descriptive shortcomings that are not consistent with established scientific methodology nor with the IPCC’s own guidelines in treatment of data uncertainties.  These shortcomings make it difficult for nonspecialists to evaluate and validate the IPCC conclusions.

      Although the data-treatment procedure itself is well grounded, limitations on global-warming estimates are not expressed unambiguously throughout the scientific report, nor – more importantly – are they expressed in language and placement that shed light on their relational nature.  Even though error quantification would seem to be attended to by the explicitly defined, standardized, and frequent use of probabilistic terms such as “unlikely,” “likely,” and “very likely,” these qualifiers apply only to the normalized results.  The IPCC does not make an absolute, unfettered determination of climate change.

      Simply stated, the IPCC is exuding more confidence than justified.

      Another deficiency is the absence of probabilistic error bars accompanying prominent graphic presentations that illustrate global-warming physics processes and feedback mechanisms.  A cautionary flag should be raised whenever quantitative error ranges are omitted from bar graphs and other data representations, as well as from word-descriptive results.  These might have been “dumbed-down” for decision makers.

      Having had professional experience [3] with absolute and relative measurements of physical parameters, I was particularly drawn to the omission of data-accuracy estimates and to its contextual significance for the IPCC report.  In dialog with professional colleagues, I subsequently found that they too were unaware of the report’s dependency on trend analysis, with its attendant limitations.  While trend analysis is a widely used analytical procedure for isolating a particular effect, the result should be more conspicuously associated with its probabilistic limitations.

      Taking everything into account within the IPCC report, insufficient attention has been given to stochastic shortcomings: Systematic uncertainties for the end result are not fully identified and assessed.

      This perspective, modestly offered on behalf of quantitative metrology, does not appear to be readily and overtly appreciated by all interested and affected parties: the public, the newsmedia, policymakers, and even many scientists.

Return to Table of Contents

  

      SCIENTIFIC CONCERNS ABOUT IPCC DATA NORMALIZATION

Having supplied the preceding topical overview, we can now get into referenced specifics about scientific limitations in published IPCC global-climate-change estimates.

      The Executive Summary to the IPCC’s Chapter 2, “Changes in Atmospheric Constituents and in Radiative Forcing,” contains the following footnote [4] (emphasis added):

The RF represents the stratospherically adjusted radiative flux change evaluated at the tropopause….  RF here refers to global mean RF….  Positive RFs lead to a global mean surface warming….  Radiative forcings are calculated in various ways depending on the agent: from changes in emissions and/or changes in concentrations, and from observations and other knowledge of climate change drivers.  In this report, the RF value for each agent is reported as the difference in RF, unless otherwise mentioned, between the present day (approximately 2005) and the beginning of the industrial era (approximately 1750), and is given in units of W m-2.

      In order to normalize the RF data, the IPCC chose an “industrial era” reference point, because that is more or less when instrumented measurements of climate-related parameters became available.[5]

  As mentioned, “normalization,” sometimes called “trend analysis,” is a process of extracting statistically meaningful trends from complex data.[6] In order to improve trend precision, systematic dependency (accuracy) must be tied to a referent, in this case data stated variously in the IPCC report to be at “1750,” at “approximately 1750,” or “at the beginning of the industrial era.”

      The IPCC adjustment process must be placed in context: Net anthropogenic radiative forcing is comparatively small in absolute terms compared to global solar irradiance and intervening climatological phenomena.  Yet, the following unconditional sentence is found in the Fourth Assessment Report about changes in radiative forcing:[7]

The combined net RF estimate for all anthropogenic drivers has a value of +1.6 W m-2 with a 0.6 to 2.4 W m-2 90% confidence range.

      Thus, their RF estimate [8] constitutes a small fraction (about 0.12%) of the Sun’s global irradiance,[9],[10] which itself has been a source of significant variability.[11] Intervening uncertain and complex orbital, atmospheric, and climatological processes are 1000-fold more significant.  For the IPCC to have derived a fractional RF value directly from basic principles and parameters of climatology is manifestly beyond the capability of current analysis and modeling.  In addition, challenges have been mounted to the specific choice of referent; for example, Gerald E. Marsh, in Physics & Society,[12] examined comparative climate stability going back about ½ billion years, tracking the Phanerozoic eon.

      Instead, the IPCC carried out an analytical process, of a kind often utilized in physics and metrology, such that uncertainties resulting from combinations of many or larger variables would not mask the 255-year RF change.

      Qualifying terms — such as “at,” “since,” “relative to” 1750 — are indeed included frequently in the IPCC report and its summaries in notations about human activities, solar irradiance, climate change and radiative forcing.[13] In taking the “beginning of the industrial era” as a reference point, the IPCC thereby avoided having its results overwhelmed by uncertainties in proxy values of solar irradiance [14] or, alternatively, by potential dependency accruing from even larger uncertainties introduced from a paleoclimatic baseline.[15]

      Although industrial-era normalization is a productive heuristic tool, the IPCC assessment of global-climate change lacks quantitative analysis for sensitivity to other referents, even though earlier paleoclimatic intervals were thoroughly examined in their report.[16] By selecting a specific touchstone, any interpretation of current global climate trends is subordinated and subject to being meaningfully altered if different referents were chosen.  A parametric analysis would help clarify RF sensitivity to alternative antecedents.
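      As a sketch of the kind of parametric analysis I have in mind, consider the toy calculation below — the temperature series is wholly invented (a slow millennial decline, a Little Ice Age-like dip, and a 20th-century rise), chosen only to show how strongly a computed anomaly depends on the referent year.

```python
# Toy parametric-sensitivity sketch: how a computed "anomaly" depends on the
# baseline year chosen as referent.  The series is invented for illustration.
import numpy as np

years = np.arange(1000, 2006)                             # 1000 through 2005
temps = (15.0
         - 0.0003 * (years - 1000)                        # slow millennial decline
         - 0.4 * np.exp(-((years - 1700) / 120.0) ** 2)   # Little Ice Age-like dip
         + 0.8 / (1.0 + np.exp(-(years - 1950) / 25.0)))  # 20th-century rise

for referent in (1000, 1400, 1750, 1900):
    anomaly_2005 = temps[-1] - temps[years == referent][0]
    print(f"2005 anomaly relative to {referent}: {anomaly_2005:+.2f} K")
```

      In this invented series, the 2005 anomaly measured against 1750 (near the bottom of the dip) comes out roughly twice as large as the anomaly measured against a year-1000 referent — precisely the sensitivity that a published parametric analysis would make visible to decision makers.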

      IPCC global climate-trend conclusions are therefore circumscribed by the selected normalization.  Moreover, the contemporary (2005) IPCC radiative-forcing value is not ab initio, that is, not derived from non-dependent data.[17]

      IPCC report Chapter 1, where “Treatment of Uncertainties” is discussed,[18] could lead undiscerning readers to a misleading judgment about the IPCC terminology that defines and applies terms such as “unlikely,” “likely,” “very likely,” and “expert judgment.”[19] Because the climate projection is based entirely on trend analysis tied to the industrial era,[20] the data treatment and its inherent inaccuracy fundamentally constrain IPCC conclusions.[21] This limitation is, figuratively, “buried” in fine print.[22]

      So the question arises: why is the differencing process, as such, not at least explicitly stated and candidly discussed in the main body of Chapter 2?  Although phrases comparing 2005 data with 1750 data are frequently included in text, tables, and figures, the only explicit acknowledgment — that the key radiative-forcing factor used to anticipate present-day global-climate trends has been normalized — is found solely in the aforementioned fine-print footnote to an Executive Summary, seemingly as an afterthought.[23]

      On the other hand, the IPCC’s guidance manual for treatment of uncertainties [24] recommends that “All authors … should be [as] specific as possible throughout the report about the kinds of uncertainties affecting their conclusions and the nature of any probabilities given.” The manual advises Working Group authors to “…identify the most important factors and uncertainties that are likely to affect the conclusions [and] specify which important factors/variables are being treated exogenously or fixed….” When discussing “value choices” [e.g., fixed parameters], it states that “such value choices should be treated parametrically so that decision makers can see the implications of adopting different value judgments.” The IPCC report does not conform to its own guidance in this respect.

      Understatement of normalization (a critical condition for deriving their resultant RF value) leaves open the impression that the IPCC report is mission-directed, even self-fulfilling, rather than methodologically-oriented.  An outside reviewer might very well have insisted on insertion of explicit caveats in such locations as the Preface, in Chapter 1 (“Historical Overview of Climate Change Science”), in “Frequently Asked Questions,” in “Summary for Policymakers,” and in the corresponding “Synthesis Report.” Lost were opportunity and obligation to explain the necessity for exogenous dataset normalization and its inherent limitations on conclusions.

      As a result of the preceding considerations, the causes and remedies for global climate change should be viewed as substantially more uncertain than acknowledged by the IPCC.  Notwithstanding that uncertainty, a United Nations news announcement — accompanying issuance of the international assessment — asserted “Evidence is now ‘unequivocal’ that humans are causing global warming.”[25]

      The crucial (previously quoted) unconditional footnote from the Chapter 2 Executive Summary in the IPCC Fourth Assessment Report could, in my opinion, be restated to read somewhat as follows, with emphasis added to stress context-advisable qualifiers:

The combined net RF estimate for all anthropogenic drivers, when normalized to the beginning of the “industrial period,” is 1.6 W m-2 larger than its estimated value in 1750, with a 90% precisional confidence range 0.6 to 2.4 W m-2 for the 255-year trend analysis.

      Recall that the RF value is a small fractional difference in a natural solar-insolation process that is globally integrated and multi-variate (orbital, spatial, temporal, atmospheric, chemical, spectral).  Nevertheless, the IPCC published its RF result with a high stated precision (“90% confidence range”), ignoring potential systematic uncertainty accruing from data tied to the industrial period.  By not fully clarifying limitations of confidence, the public and its policymakers are not receiving complete disclosure of restrictions inherent in the survey.[26]

      Moreover, the fact that global-warming extrapolations are entirely dependent on (an extremely detailed and praiseworthy) exogenous trend analysis is a condition deeply buried in the IPCC report, nowhere acknowledged in policymaker language.

Return to Table of Contents

 

CONCLUSIONS ABOUT THE GLOBAL-WARMING TREND

With regard to inferences too readily embraced from the IPCC report about global-warming magnitude and tendency, I have five specific demurrals (reservations):

(1) The crucial derived radiative forcing for global climate change is not absolute; it is, in fact, relative (normalized) to a fixed value a quarter-millennium earlier.

(2) If other contingent normalization intervals — hundreds, thousands, or millions of years earlier — were selected, the global climate-change context would appear significantly different.

(3) IPCC reports and public statements vastly understate their dependence on the process of data normalization and the choice of referent.

(4) The IPCC failed to follow its own guidelines for treatment and reporting of value choices, parametric dependencies, and explicit statement of uncertainties.

(5) Various endogenic and exogenic policy-oriented statements stimulated by the IPCC lack alternative humanistically balanced perspectives.

      Even while appreciating the comprehensive compilation and assimilation of global-environment data by the IPCC, there is room for reservation and debate about the level of statistical confidence assigned to their climate-change projections.

      As others have argued, anthropogenic contributions to global ambiance in the last few hundred years might have less to do with CO2 emissions than with some factor correlated with human population growth; or, current climate change might simply be a not-so-abnormal naturally-caused, solar-driven blip on a much wider planetary time scale.

      Notwithstanding impressive “trend analysis” by thousands of conscientious scientists after a decade of exhaustive examination, the best causative interpretation that now can be quantitatively ascribed for human contribution to global warming is, “it depends.”  It depends substantially on validity of a less-than-candid IPCC industrial-era normalization practice that avoids the much larger uncertainties that would accrue if paleoclimatic referents were adopted.

      Inasmuch as the chosen year-1750 IPCC temperature normalization falls within the final decades of the Little Ice Age, present-day warming patterns taken out of longer-term context appear deceptively determinative in short-term graphs (which have a characteristic “hockey-stick” shape).

      In any event, even if there is a 20th-century global-warming correlation with anthropogenic radiative forcings, a broader longitudinal and humanistic perspective would be advisable for policymaking purposes.

Return to Table of Contents

 

HUMANISTIC BALANCE

As for public-media impressions that global warming is an immediate and dire threat to the planet, here’s a broader, humanistic perspective: Whatever impact global climate change might eventually have, policymakers are obligated to prioritize the present-day urgency and burden of Earth-bound problems — widespread hunger, war, crime, health deficiencies, pestilence, water shortages, pollution, injustice, overpopulation, deforestation, and soil degradation — to highlight some troubling aspects of the current human condition.[27]

      Here’s a paraphrased quote from an environmentally aware colleague:

If global warming fears are uncertain or overblown, perhaps we could begin to concentrate on environmental issues of which we are certain, namely:  pollution and toxification of the air, land, and water, as well as habitat destruction and species extinctions – all driven largely by a half-century of on-going human-population explosion.

      I agree.  In my opinion, public-tax money should be spent on implementing ecologically sustainable infrastructure and on rationally attenuating growth of the human population.

      Contemporary human and environmental stresses are being slighted whenever resources, such as public taxes and private donations, are allocated to what might merely be a marginal prospect for rectifying anticipated harmful climate effects.

      Although much lobbying has emerged for human resources to be committed to global-warming remediation, I hope that my critique will generate some on-topic, specific response.  Socially committed scientists can contribute an independent examination of the foundations for climatological diagnosis and consequence.

      In any event, whatever the validity of climate-trend analysis, there are significant competing humanitarian options for policymakers to prioritize as a result of acute and chronic distress besetting the existing human and environmental condition.

Return to Table of Contents 

ACKNOWLEDGMENTS

I am indebted to my Argonne National Laboratory colleagues, especially Jerry Marsh and George Stanford, for their inspiration and independent insight into the science behind global climate-change studies.  In particular, I have been observing, with increased conviction, that decreased sunspot activity (acting with a correlated time delay) is the driving force behind the past decade’s apparent, statistically significant leveling off of measured average global temperature.  In other words, global temperatures do appear to be dependent on solar irradiance (while CO2 emissions have continued upward).

      Although a personal disclaimer might do little to avoid being simplistically labeled, I have wavered between agnosticism and skepticism regarding the significance of anthropogenic contributions to global warming.  I am neither a “believer” nor a “denier.” 

      Of course there is ongoing global climate change, but is it significantly influenced by human activities?  Originally uncertain, I now lean toward skepticism that humans are significantly responsible for the globally averaged climate change.

      In any event, in addition to ad-hoc natural-crisis management, humanity has many competing ills with which it must cope.

Return to Table of Contents 

NOTES

[1] Working Group I Report, “The Physical Science Basis,” in Climate Change 2007, the Fourth Assessment Report (AR4) of the United Nations Intergovernmental Panel on Climate Change (2007), containing 11 chapters and supplementary material.  [Also within Climate Change 2007 are the Working Group II Report, “Impacts, Adaptation and Vulnerability,” and the Working Group III Report, “Mitigation of Climate Change.”]

[2] In 1993 the U.S. Supreme Court revised the federal judicial standards for testimony regarding areas of science that required an explicit estimate of probabilistic error.  Daubert v. Merrell Dow Pharmaceuticals ruled that quantifiable evidence should meet a “scientific method” standard.  More specifically, the Daubert decision called for the admissibility of expert testimony to be based on those standards, key among them being whether the testimony is connected explicitly to a testable hypothesis, and whether there is a known or potential error associated with the evidence.

[3] My scientific career included engagement in applied-nuclear experiments, to measure the absolute value of the number of neutrons (nu) emitted by 252Cf, with the reported result subjected to intense peer analysis and reevaluation.  (Other measurements had been made to normalize neutron-yield values of the fissile isotopes 233U, 235U and 239Pu, which themselves had been determined to high precision relative to the measured absolute value for nu(252Cf) within the range of measurement error.)  In carrying out this published research, I accumulated intensive experience in radiation metrology, especially for the radioactive isotope 56Mn, and in radiation-coincidence and correlation/cross-correlation analysis, both in that work and in subsequent technical evaluations.  All of this required rigorous attention to estimation of systematic and random error.

[4] Climate Change 2007, Ch. 2, p. 131

[5] Shorter-term climate-related trends, mostly initiated with the beginning of the satellite era (~1979), are also found throughout the IPCC reports.

[6] Although the term “normalization” can be found with various usages, the process itself is well established in many scientific fields for the purpose of analyzing exclusionary differences or trends.  For example, in epidemiology the referent is usually labeled a “control group,” and factors extraneous to the control are called “confounding variables.”

[7] Ch. 2, p. 200

[8] In accordance with the stated IPCC error-range estimate, there remains about 5% expectation that combined anthropogenic drivers have a null effect (no net influence on global climate), and about 2% possibility that human activity offsets an otherwise natural rise in global temperature.

[9] Insolation, integrated over mean astronomical distance and bandwidth, is the dominant source of external energy reaching the Earth’s atmosphere.

[10] Insolation has been measured since 1978 by different satellites, with an annual average of 1366.1 ± 0.5 W m-2 (1 standard deviation).  The IPCC reports the “current estimate” variously as 1370 W m-2 and 1365 W m-2, without error ranges.  These measurement results differ among themselves by more than the Panel’s estimate of combined anthropogenic-driven radiative forcing (+1.6 W m-2) and its stated uncertainty of +0.6 to +2.4 W m-2 (90% confidence range).
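The comparison in this note is plain arithmetic; a minimal check (my own, using only the figures quoted above):

```python
# The spread between the IPCC's own irradiance figures exceeds the entire
# anthropogenic-RF confidence range quoted in the report.
tsi_quotes = (1370.0, 1365.0)          # W m-2, quoted without error ranges
spread = max(tsi_quotes) - min(tsi_quotes)
print(spread)                          # 5.0 W m-2
print(spread > 2.4)                    # True: larger than the RF upper bound
```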

[11] Nicola Scafetta and Bruce J. West, “Is climate sensitive to solar variability?”, Physics Today (March 2008).

[12] Gerald E. Marsh, “Climate Stability and Policy,” Physics & Society 37(2):4-9 (April 2008).

[13] Only in the Summary for Policymakers [p. 2 (Footnote 2)] have I found the following clarifying footnote:

Radiative forcing is a measure of the influence that a factor has in altering the balance of incoming and outgoing energy in the Earth-atmosphere system and is an index of the importance of the factor as a potential climate change mechanism.  Positive forcing tends to warm the surface while negative forcing tends to cool it.  In this report, radiative forcing values are for 2005 relative to pre-industrial conditions defined at 1750 and are expressed in watts per square meter (W m-2).  See Glossary and Section 2.2 for further details.

[14] Latitudinal RF varies by up to 400 W m-2.

[15] One example is the onset of the last major ice age, about 116,000 years ago, when the 65-degree north-latitude mid-June climate-driving insolation was lower by ~40 W m-2 than today.  [Ch. 6, p. 445]

[16] Especially in Chapter 6, “Paleoclimate,” and Chapter 9, “Understanding and Attributing Climate Change.”

[17] Resultant limitations on forecasting greenhouse-effect trends are not explicit in IPCC reports.

[18] Ch. 1, p. 118

[19] “Guidance Notes for Lead Authors of the IPCC Fourth Assessment Report on Addressing Uncertainties,” p. 4

[20] I have endeavored without success to find absolute-error estimates of radiative forcing in the IPCC report.

[21] In nuclear metrology, results are explicitly conditioned and qualified in order to differentiate precision from accuracy, which have standardized meanings and expressions in scientific methodology.

[22] Finding explicit acknowledgment in the Fourth Assessment Report — that the IPCC value for RF and its stated precision is the result of normalization — has eluded me.  Other than in the Executive Summary of Chapter 2, it does not seem to be mentioned, except for the following oblique instance: in Figure 2.20, a bar graph is labeled “Radiative forcing of climate between 1750 and 2005.”  No context is provided in that figure to indicate that the radiative-forcing values are limited by their relational foundation, which does not take into account pre-industrial-age natural phenomena.

[23] Nor is normalization acknowledged in FAQ 2.1, Box 1: “What is Radiative Forcing?”  As for Section 2.9, “Synthesis,” normalization is not expressed in Subsection 2.9.1, “Uncertainties in Radiative Forcing,” although it is implied within figure and table captions in Subsection 2.9.2, “Global Mean Radiative Forcing.”

[24] Richard H. Moss and Stephen H. Schneider, “Uncertainties in the IPCC TAR: Recommendations to Lead Authors for More Consistent Assessment and Reporting” (undated).

[25] http://www.un.org/apps/news/printnewsAr.asp?nid=21429

[26] Googling the Internet (24 Dec. 2007) for key words, such as “since 1750” and “relative to 1750” – coupled to “climate change,” “solar irradiance” and “radiative forcing” – failed to uncover any public examples of reports or critiques that recognize these relational limitations regarding IPCC reports.

[27] An escalating example of induced humanistic distress resulting from insufficiently considered, environmentally driven policy is the subsidized diversion of corn production from food-crop supply to ethanol transportation fuel.

Return to Table of Contents