Elsevier

Environmental Modelling & Software

Volume 59, September 2014, Pages 10-29

Identifying parametric controls and dependencies in integrated assessment models using global sensitivity analysis

https://doi.org/10.1016/j.envsoft.2014.05.001

Highlights

  • We present Sobol' sensitivity analyses of three climate policy scenarios.

  • Parameter interactions are significant, evolve over time, and vary by policy.

  • Key interactions and sensitivities are misclassified in expert-elicited assessments.

  • Climate damages not sufficiently sensitive to climate/carbon cycle parameters.

  • Costs of aggressive mitigation controlled by participation and renewables.

Abstract

Integrated assessment models for climate change (IAMs) couple representations of economic and natural systems to identify and evaluate strategies for managing the effects of global climate change. In this study we subject three policy scenarios from the globally aggregated Dynamic Integrated model of Climate and the Economy (DICE) IAM to a comprehensive global sensitivity analysis using Sobol' variance decomposition. We focus on cost metrics that represent diversions of economic resources from gross world production. Our study illustrates how the sensitivity ranking of model parameters differs across alternative cost metrics, over time, and across emission control strategies. The study also provides a comprehensive illustration of the negative consequences of using a priori expert elicitations to reduce the set of parameters analyzed in IAM uncertainty analysis. The results make a strong argument for conducting comprehensive model diagnostics for IAMs that explicitly account for parameter interactions between the coupled natural and economic system components.

Introduction

Climate change is one of the most challenging issues confronting the scientific and policy communities. The National Research Council (NRC, 2009) has called for advances in climate change decision support that facilitate a “deliberation with analysis” approach to the problem. A key aspect of “deliberation with analysis” is the need for frameworks that aid in identifying the key uncertainties influencing the trade-off between near-term carbon dioxide (CO2) mitigation costs and long-term risks posed by climate change. A large body of literature has emerged seeking to better characterize this trade-off using integrated assessment models (IAMs) (Parson and Fisher-Vanden, 1997, Kelly and Kolstad, 1999). IAMs seek to inform our understanding of the coupled natural and economic systems that shape mitigation and adaptation decisions. More formally, Kelly and Kolstad (1999) define an IAM as “… any model which combines scientific and socio-economic aspects of climate change primarily for the purpose of assessing policy options for climate change control”. For evaluating climate mitigation strategies, IAMs must incorporate important aspects of the climate system and the global economy, and yet be sufficiently transparent to be useful for decision support (Kelly and Kolstad, 1999, Stanton et al., 2009). For IAMs to be useful they need to advance our understanding of the linkages between economic activities, greenhouse gas emissions, the carbon cycle, climate and damages (Parson and Fisher-Vanden, 1997, Courtois, 2004, Stanton et al., 2009, Weyant, 2009). Broadly there are two classes of IAMs (Stanton et al., 2009): (1) inter-temporal optimization models, and (2) simulation models. Inter-temporal optimization models seek to identify a best future course based on global/regional welfare or cost optimization. Optimality is typically defined in this class of IAMs subject to an assumption of perfect foresight and the IAM modeler's expected state-of-the-world (SOW). Simulation (or evaluation) models, instead, play out specific policy scenarios over time without explicitly defining or seeking optimality. Both of these classes of IAMs are nonlinear and require large numbers of externally-specified (exogenous) parameters to abstract the economic and natural systems being modeled.

IAMs are now garnering significant roles in shaping climate change impact projections and in the formulation of alternative mitigation policies (IPCC, 1996, Stern, 2007, EPA, 2010, EPA, 2013, UNEP, 2010, UNEP, 2011, NRC, 2011, Rogelj et al., 2011, Rogelj et al., 2013a, Rogelj et al., 2013b). Many agencies (EPA, 2009, EU, 2009) recommend that all models used for policy development and analysis, including IAMs, be rigorously evaluated. The challenges of evaluating IAMs, as reviewed over the past two decades (Risbey et al., 1996, Stanton et al., 2009, Schwanitz, 2013), include potentially high degrees of model complexity, the degree of integration and resolution of model components, and incomplete knowledge of underlying processes and data. Efforts to model the inherently unknown future behavior of complex, inter-related systems have led to a focus on the uncertainties associated with framing possible futures, often in the context of community model inter-comparison exercises (e.g., Clarke et al., 2009). Our study builds on additional guidance from the broader environmental modeling community for improving diagnostic assessments of complex environmental modeling systems (e.g., Jakeman et al., 2006, Gupta et al., 2008, Gudmundsson et al., 2012, Kelly (Letcher) et al., 2013, Baroni and Tarantola, 2014).

Schwanitz (2013) recently outlined an evaluation framework specifically for the IAM community. Included as one of the tools in this framework, global sensitivity analysis has the potential to attribute the uncertainty in an IAM's projections to its parameters, both individually and collectively (Saltelli et al., 2008). To date, sensitivity analyses of IAMs have focused on specific functions or modules within a given model (Keller et al., 2004, Gillingham et al., 2008, Ackerman et al., 2010) or on exploiting expert elicitations to reduce the set of parameters to be analyzed with a local sensitivity analysis (Peck and Teisberg, 1993, Prinn et al., 1999, Toth et al., 2003). Recent studies that apply global statistical sampling to IAMs still confine sensitivity testing to a small subset of parameters within limited Monte Carlo samples (Pizer, 1999, Scott et al., 1999, Goodess et al., 2003, Campolongo et al., 2007, Nordhaus, 1994, Nordhaus, 2008, Kypreos, 2008, Johansson, 2011). Overall, these analyses overlook the potential for multiple parameters in an IAM to interactively influence the outcomes and, consequently, may lead to incorrect inferences as to which parameters or factors most strongly influence key uncertainties (Saltelli and D'Hombres, 2010).

We focus our sensitivity analysis on the globally-aggregated IAM, the Dynamic Integrated model of Climate and the Economy (DICE) (Nordhaus, 1994, Nordhaus and Boyer, 2000, Nordhaus, 2008), and extend the uncertainty and sensitivity analysis reported in Nordhaus (2008). Our purpose is to demonstrate that for IAMs, i.e., non-linear models with many exogenous parameters, the uncertainties of model outputs can arise from complex parameter interactions. DICE presents a simple, yet comprehensive, representation of the world where alternative economy-climate scenarios can be tested without having to explicitly model the complexities of the global system. There are multiple potential foci when designing a global sensitivity analysis of an inter-temporal optimization IAM. The choice of the appropriate experimental approach depends on the overall policy question to be answered. For example, one question that might be explored is, how do scenario pathways for a given stabilization goal change across alternative SOWs? This problem is reflective of the majority of IAM studies where the primary focus is on comparing the resulting optimized policy scenario outcomes. Alternatively, we pursue in this study the question, how vulnerable are specific optimized DICE policy scenarios to uncertainties in the exogenous assumptions? By isolating the policy scenarios from the optimization process, we are exploring which exogenous parameters (e.g., population growth, technology efficiency, climate sensitivity) control deviations from the policy costs attained under the assumption of perfect information. We do not recalibrate the model to external data sources for each sampled SOW, do not re-optimize the model for each sampled SOW, and do not claim to assign likelihoods to exogenous parameter combinations. Rather we measure how exogenous parameters, individually and interactively, affect selected policy-relevant model outputs. For a deterministic, perfect foresight model such as DICE, it is arguably quite useful to know the vulnerabilities of a policy solution and to identify the key model parameters that control its performance over time. Our results could also inform subsequent calibration efforts or uncertainty analyses by giving an improved a posteriori understanding of complex, interactive parametric effects.

Here we use the cost-benefit form (see Section 2.2 below) of the DICE model as described in Nordhaus (2008). In this form of the model, a policy scenario outcome is characterized by the control variables (emission control rates and investment) that optimize the objective function (the sum of the discounted utility of consumption over time) subject to the applied constraints, such as available fossil fuel resources and limits on atmospheric temperature increase. Emission pathways are endogenous in this form of the model. A different form of the model (the cost-effectiveness form) is employed when pre-specified emission control pathways are used (Meinshausen et al., 2011a, Rogelj et al., 2012). See Appendix Fig. A.9 for an example of a DICE policy scenario and the resulting emissions pathway.
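As a point of reference, the objective being maximized can be written schematically as follows. This is a simplified sketch consistent with standard presentations of DICE (Nordhaus, 2008) rather than the full model specification, and the notation is illustrative.

```latex
% Schematic form of the DICE cost-benefit objective (simplified; constraints omitted)
\max_{\mu(t),\, I(t)} \; W = \sum_{t} U\!\left[ c(t), L(t) \right] R(t),
\qquad
U\!\left[ c, L \right] = L \, \frac{c^{1-\alpha}}{1-\alpha},
\qquad
R(t) = (1+\rho)^{-t}
```

Here μ(t) is the emission control rate, I(t) is investment, c(t) is per-capita consumption, L(t) is population, α is the elasticity of marginal utility of consumption, and ρ is the pure rate of time preference; constraints such as the fossil fuel resource limit and any temperature ceiling enter separately.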

For this study we construct a simulation version of DICE, called CDICE, which reproduces DICE model outcomes for a supplied policy scenario, given the reference values of all exogenous parameters. With this simulation model, we can explore the vulnerability of a fixed policy scenario to uncertainty in the DICE model's exogenous parameters. We choose three distinctly different DICE policy scenarios to see how parametric sensitivities change for scenarios with different treatments of the trade-offs between climate damages and abatement costs. In this study, we apply the Sobol' method, a global variance-based sensitivity analysis method (Sobol', 2001, Saltelli et al., 2008), to CDICE simulations of each policy scenario. For the Sobol' analysis, we select model outputs of interest (in this case, climate damages and abatement costs). We create ensembles of these model outputs by iteratively running the CDICE simulation model while simultaneously varying a selection of model parameters over specified ranges using Sobol' quasi-random sampling. The Sobol' method is then used to decompose the variance of the damage and abatement cost outputs into portions contributed individually or interactively by the sampled parameters.
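For reference, the first-order and total-order Sobol' indices underlying this decomposition take the standard form given by Saltelli et al. (2008); the expressions below are the general definitions, not anything specific to CDICE.

```latex
% First-order (S_i) and total-order (S_{T_i}) Sobol' indices for output Y and parameter X_i
S_i = \frac{\operatorname{Var}_{X_i}\!\left( \mathbb{E}_{\mathbf{X}_{\sim i}}\!\left[ Y \mid X_i \right] \right)}{\operatorname{Var}(Y)},
\qquad
S_{T_i} = 1 - \frac{\operatorname{Var}_{\mathbf{X}_{\sim i}}\!\left( \mathbb{E}_{X_i}\!\left[ Y \mid \mathbf{X}_{\sim i} \right] \right)}{\operatorname{Var}(Y)}
```

Here X_i is the parameter of interest and X_~i denotes all remaining parameters; S_i measures the share of output variance attributable to X_i alone, while S_Ti additionally includes every interaction involving X_i, so a gap between S_Ti and S_i signals interactive effects.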

This exercise demonstrates the importance of understanding the non-separable, interactive parameter dependencies that control uncertain IAM projections. We also contrast our findings with the more typical local sensitivity analysis as performed in Nordhaus (2008). Our results illustrate the consequences of using a priori expert elicitations to reduce the set of parameters analyzed, especially within the context of a one-at-a-time (OAT) sensitivity analysis. The results of this global sensitivity analysis provide a strong argument for comprehensive model diagnostics for IAMs to explicitly account for the parametric interactions between their coupled natural and economic components. Moreover, this study illustrates how the sensitivity ranking of model parameters differs for alternative cost metrics, over time, and for alternative emission control strategies.
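To make this concrete, the sketch below shows how a global, simultaneous-variation experiment of the kind described above could be set up with the SALib Python package, in contrast to an OAT design that perturbs one parameter at a time. It is a minimal illustration under stated assumptions, not the authors' code: the parameter subset, the ranges, the sample size, and the stand-in for a CDICE run with a fixed policy scenario are all hypothetical.

```python
# Minimal, hypothetical sketch of a Sobol' variance-decomposition experiment
# on a CDICE-like simulator. Parameter names, ranges, and the stand-in model
# are illustrative assumptions, not the study's actual configuration.
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

# Illustrative subset of exogenous parameters with assumed sampling ranges
problem = {
    "num_vars": 3,
    "names": ["climate_sensitivity", "pop_growth_rate", "damage_exponent"],
    "bounds": [[1.5, 4.5], [0.0, 0.02], [1.5, 2.5]],
}

def cdice_like_model(x):
    """Stand-in for one CDICE run (fixed policy scenario) returning a scalar
    cost metric, e.g. discounted climate damages; a real analysis would call
    the simulation model here."""
    s, g, a = x
    return (s ** a) * (1.0 + g) ** 10  # placeholder response surface

# Sobol' quasi-random sample: N * (2k + 2) parameter sets for k parameters
param_values = saltelli.sample(problem, 1024)
outputs = np.array([cdice_like_model(x) for x in param_values])

# Decompose output variance into first-order (S1) and total-order (ST) shares
indices = sobol.analyze(problem, outputs)
print("First-order:", indices["S1"])
print("Total-order:", indices["ST"])
```

Total-order indices that substantially exceed their first-order counterparts flag the parameter interactions that an OAT design, by construction, cannot detect.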

In Section 2 we describe the DICE IAM and the CDICE simulation model, as well as the policy scenarios used in this study. Section 3 presents the methods used and describes the computational experiments. Results and implications are discussed in Section 4, followed by conclusions in Section 5.

Section snippets

The DICE model

Fig. 1 provides a schematic overview of the DICE IAM. In this study, we use version 2007.delta.8b, which is documented in detail in Nordhaus (2008), and was obtained from the author's website (http://nordhaus.econ.yale.edu) in February 2011. The model presents a neoclassical economic growth theory view of the economics of climate change (Nordhaus, 2008). This version of the DICE model builds on more than twenty years of development of a conceptually simple, yet complete, example of a fully

Sobol' sensitivity analysis

We apply a global sensitivity analysis based on the variance decomposition method proposed by Sobol' (2001) and described by Saltelli et al. (2008) to assess the sensitivity in CDICE of the DICE policy scenarios to (1) the exogenous model parameters identified as having the most influence on the model in Nordhaus (2008) and (2) an extended list of model parameters (see Section 3.2 for details). The Sobol' method (Sobol', 2001, Saltelli et al., 2008) decomposes the variance of model outputs from

OAT sensitivities of the CDICE model outputs

Fig. 3 verifies that we are able to match the Nordhaus OAT-based sensitivity rankings (Nordhaus, 2008, Chapter 7) for the parameters controlling the increase in atmospheric temperature in the first decade of the 22nd century in DICE (Fig. 3A) and CDICE (Fig. 3B) using the BAU policy scenario. The atmospheric temperature increase using reference values of all parameters is 3.2 °C for DICE (Nordhaus, 2008) and 3.1 °C for CDICE, illustrating that the CDICE simulation captures the atmospheric

Conclusions

This study contributes a detailed demonstration of the consequences of using expert elicitations to narrow the set of parameters used for sensitivity and uncertainty analyses of IAMs. We demonstrate how local approaches, such as OAT analysis with a small parameter set varied over a small part of the feasible parameter space, can mis-classify key sensitivities. As a result, research investments to reduce uncertainties guided by an OAT analysis could be biased, and early-warning signs of policy

Acknowledgments

This work was supported by the U.S. Department of Energy, Office of Science, Biological and Environmental Research Program, Integrated Assessment Research Program, Grant No. DE-SC0005171, with additional support from NSF through the Network for Sustainable Climate Risk Management (SCRiM) under NSF cooperative agreement GEO-1240507 and the Penn State Center for Climate Risk Management. The authors thank William Nordhaus for making the DICE model available, and Alex Libardoni, Chris Forest and

References (77)

  • Anderson, B., et al., 2013. Uncertainty in climate change modeling: can global sensitivity analysis be of help? Risk Anal.
  • Andres, T.H., 1997. Sampling methods and sensitivity analysis for large parameter sets. J. Stat. Comput. Simul.
  • Archer, G.E.B., et al., 1997. Sensitivity measures, ANOVA-like techniques and the use of bootstrap. J. Stat. Comput. Simul.
  • Bodman, R.W., et al., 2013. Uncertainty in temperature projections reduced using carbon cycle and climate observations. Nat. Clim. Change.
  • Box, G.E.P., et al., 2005. Statistics for Experimenters: Design, Innovation, and Discovery.
  • EIA, 2012. International Energy Statistics: Carbon Intensity Using Purchasing Power Parities.
  • EPA, 2009. Guidance on the Development, Evaluation, and Application of Environmental Models.
  • EPA, 2010. Social Cost of Carbon for Regulatory Impact Analysis.
  • EPA, 2013. Technical Update of the Social Cost of Carbon for Regulatory Impact Analysis.
  • EU, 2009. Impact Assessment Guidelines.
  • Goodess, C.M., et al., 2003. Representing climate and extreme weather events in integrated assessment models: a review of existing methods and options for development. Integr. Assess.
  • Gudmundsson, L., et al., 2012. Evaluation of nine large-scale hydrological models with respect to seasonal runoff climatology in Europe. Water Resour. Res.
  • Gupta, H.V., et al., 2008. Reconciling theory with observations: elements of a diagnostic approach to model evaluation. Hydrol. Process.
  • IEA, 2010. Key World Energy Statistics.
  • IIASA, 2007. Probabilistic World Population Projections.
  • IPCC, 1996. Climate Change 1995: Economic and Social Dimensions of Climate Change. Contribution of Working Group III to the Second Assessment Report.
  • IPCC, 2007. Climate Change 2007: The Physical Science Basis. Contribution of Working Group I to the Fourth Assessment Report.
  • Johansson, D.J.A., 2011. Temperature stabilization, ocean heat uptake and radiative forcing overshoot profiles. Clim. Change.
  • Nordhaus, W.D., 1993. Rolling the 'DICE': an optimal transition path for controlling greenhouse gases. Resour. Energy Econ.
  • Peck, S.C., et al., 1993. CO2 emissions control: comparing policy instruments. Energy Policy.
  • Pizer, W.A., 1999. The optimal choice of climate change policy in the presence of uncertainty. Resour. Energy Econ.
  • Plischke, E., et al., 2013. Global sensitivity measures from given data. Eur. J. Oper. Res.
  • Saltelli, A., 2002. Making best use of model evaluations to compute sensitivity indices. Comput. Phys. Commun.
  • Saltelli, A., et al., 2010. Sensitivity analysis didn't help. A practitioner's critique of the Stern review. Glob. Environ. Change.
  • Schwanitz, V.J., 2013. Evaluating integrated assessment models of global climate change. Environ. Model. Softw.
  • Scott, M.J., et al., 1999. Uncertainty in integrated assessment models: modeling with MiniCAM 1.0. Energy Policy.
  • Sobol', I.M., 1967. On the distribution of points in a cube and the approximate evaluation of integrals. USSR Comput. Math. Math. Phys.
  • Sobol', I.M., 2001. Global sensitivity indices for nonlinear mathematical models and their Monte Carlo estimates. Math. Comput. Simul.