Identifying parametric controls and dependencies in integrated assessment models using global sensitivity analysis
Introduction
Climate change is one of the most challenging issues confronting the scientific and policy communities. The National Research Council (NRC, 2009) has called for advances in climate change decision support that facilitate a “deliberation with analysis” approach to the problem. A key aspect of “deliberation with analysis” is the need for frameworks that aid in identifying the key uncertainties influencing the trade-off between near-term carbon dioxide (CO2) mitigation costs and long-term risks posed by climate change. A large body of literature has emerged seeking to better characterize this trade-off using integrated assessment models (IAMs) (Parson and Fisher-Vanden, 1997, Kelly and Kolstad, 1999). IAMs seek to inform our understanding of the coupled natural and economic systems that shape mitigation and adaptation decisions. More formally, Kelly and Kolstad (1999) define an IAM as “… any model which combines scientific and socio-economic aspects of climate change primarily for the purpose of assessing policy options for climate change control”. For evaluating climate mitigation strategies, IAMs must incorporate important aspects of the climate system and the global economy, and yet be sufficiently transparent to be useful for decision support (Kelly and Kolstad, 1999, Stanton et al., 2009). For IAMs to be useful they need to advance our understanding of the linkages between economic activities, greenhouse gas emissions, the carbon cycle, climate and damages (Parson and Fisher-Vanden, 1997, Courtois, 2004, Stanton et al., 2009, Weyant, 2009). Broadly there are two classes of IAMs (Stanton et al., 2009): (1) inter-temporal optimization models, and (2) simulation models. Inter-temporal optimization models seek to identify a best future course based on global/regional welfare or cost optimization. Optimality is typically defined in this class of IAMs subject to an assumption of perfect foresight and the IAM modeler's expected state-of-the-world (SOW). 
Simulation (or evaluation) models, instead, play out specific policy scenarios over time without explicitly defining or seeking optimality. Both of these classes of IAMs are nonlinear and require large numbers of externally-specified (exogenous) parameters to abstract the economic and natural systems being modeled.
IAMs are now garnering significant roles in shaping climate change impact projections and in the formulation of alternative mitigation policies (IPCC, 1996, Stern, 2007, EPA, 2010, EPA, 2013, UNEP, 2010, UNEP, 2011, NRC, 2011, Rogelj et al., 2011, Rogelj et al., 2013a, Rogelj et al., 2013b). Many agencies (EPA, 2009, EU, 2009) recommend that all models used for policy development and analysis, including IAMs, be rigorously evaluated. The challenges of evaluating IAMs, as has been reviewed over two decades (Risbey et al., 1996, Stanton et al., 2009, Schwanitz, 2013), include the potentially high degrees of model complexity, the degree of integration and resolution of model components, and incomplete knowledge of underlying processes and data. Efforts to model the inherently unknown future behavior of complex, inter-related systems have led to a focus on the uncertainties associated with framing possible futures. This is often done in the context of community model inter-comparison exercises (e.g., Clarke et al., 2009). Our study builds on additional guidance from broader environmental modeling communities for improving diagnostic assessments of complex environmental modeling systems (e.g., Jakeman et al., 2006, Gupta et al., 2008, Gudmundsson et al., 2012, Kelly (Letcher) et al., 2013, Baroni and Tarantola, 2014).
Recently, Schwanitz (2013) outlined an evaluation framework specifically for the IAM community. One of the tools included in this framework, global sensitivity analysis, has the potential to attribute the uncertainty in an IAM's projections to its parameters, both individually and collectively (Saltelli et al., 2008). To date, sensitivity analyses of IAMs have focused on specific functions or modules within a given model (Keller et al., 2004, Gillingham et al., 2008, Ackerman et al., 2010) or on exploiting expert elicitations to reduce the set of parameters to be analyzed with a local sensitivity analysis (Peck and Teisberg, 1993, Prinn et al., 1999, Toth et al., 2003). Recent studies that have applied global statistical sampling to IAMs still confine sensitivity testing to a small subset of parameters within a limited Monte Carlo sampling (Pizer, 1999, Scott et al., 1999, Goodess et al., 2003, Campolongo et al., 2007, Nordhaus, 1994, Nordhaus, 2008, Kypreos, 2008, Johansson, 2011). Overall, these analyses overlook the potential for multiple parameters in an IAM to interactively influence the outcomes and, consequently, may lead to incorrect inferences as to which parameters or factors most strongly influence key uncertainties (Saltelli and D'Hombres, 2010).
We focus our sensitivity analysis on the globally-aggregated IAM, the Dynamic Integrated model of Climate and the Economy (DICE) (Nordhaus, 1994, Nordhaus and Boyer, 2000, Nordhaus, 2008), and extend the uncertainty and sensitivity analysis reported in Nordhaus (2008). Our purpose is to demonstrate that for IAMs, i.e., non-linear models with many exogenous parameters, the uncertainties of model outputs can arise from complex parameter interactions. DICE presents a simple, yet comprehensive, representation of the world where alternative economy-climate scenarios can be tested without having to explicitly model the complexities of the global system. There are multiple potential foci when designing a global sensitivity analysis of an inter-temporal optimization IAM. The choice of the appropriate experimental approach depends on the overall policy question to be answered. For example, one question that might be explored is, how do scenario pathways for a given stabilization goal change across alternative SOWs? This problem is reflective of the majority of IAM studies where the primary focus is on comparing the resulting optimized policy scenario outcomes. Alternatively, we pursue in this study the question, how vulnerable are specific optimized DICE policy scenarios to uncertainties in the exogenous assumptions? By isolating the policy scenarios from the optimization process, we are exploring which exogenous parameters (e.g., population growth, technology efficiency, climate sensitivity) control deviations from the policy costs attained under the assumption of perfect information. We do not recalibrate the model to external data sources for each sampled SOW, do not re-optimize the model for each sampled SOW, and do not claim to assign likelihoods to exogenous parameter combinations. Rather we measure how exogenous parameters, individually and interactively, affect selected policy-relevant model outputs. 
For a deterministic, perfect foresight model such as DICE, it is arguably quite useful to know the vulnerabilities of a policy solution and to identify the key model parameters that control its performance over time. Our results could also inform subsequent calibration efforts or uncertainty analyses by giving an improved a posteriori understanding of complex, interactive parametric effects.
Here we use the cost-benefit form (see Section 2.2 below) of the DICE model as described in Nordhaus (2008). In this form of the model, a policy scenario outcome is characterized by the control variables (emission control rates and investment) that optimize the objective function (the discounted sum of the utility of consumption over time), subject to constraints such as available fossil fuel resources and limits on atmospheric temperature increase. Emission pathways are endogenous in this form of the model. A different (cost-effectiveness) form of this model is employed for the use of pre-specified emission control pathways (Meinshausen et al., 2011a, Rogelj et al., 2012). See Appendix Fig. A.9 for an example of a DICE policy scenario and resulting emissions pathway.
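Schematically, the objective maximized in this form of DICE is the discounted, population-weighted utility of per-capita consumption. The notation below is a simplified sketch following the DICE-2007 documentation, not a full statement of the model:

```latex
\max_{\mu_t,\, I_t} \; W \;=\; \sum_{t} \frac{L_t \, u(c_t)}{(1+\rho)^t},
\qquad
u(c) \;=\; \frac{c^{1-\alpha}}{1-\alpha},
```

where $\mu_t$ is the emission control rate, $I_t$ is investment, $L_t$ is population, $c_t$ is per-capita consumption, $\rho$ is the pure rate of time preference, and $\alpha$ is the elasticity of the marginal utility of consumption.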
For this study we construct a simulation version of DICE, called CDICE, which reproduces DICE model outcomes for a supplied policy scenario given the reference values of all exogenous parameters. With this simulation model, we can explore the vulnerability of a fixed policy scenario to uncertainty in the DICE model's exogenous parameters. We choose three distinctly different DICE policy scenarios to examine how parametric sensitivities change across scenarios with different treatments of the trade-offs between climate damages and abatement costs. We apply the Sobol' method, a global variance-based sensitivity analysis method (Sobol', 2001, Saltelli et al., 2008), to CDICE simulations of each policy scenario. For each scenario we select two model outputs, climate damages and abatement costs, and create ensembles of these outputs by iteratively running the CDICE simulation model while simultaneously varying a selection of model parameters over specified ranges using Sobol' quasi-random sampling. The Sobol' method then decomposes the variance of the damage and abatement cost outputs into portions contributed individually or interactively by the sampled parameters.
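The sample-simulate-decompose workflow just described can be sketched in a few lines, with a stand-in model in place of CDICE. The estimators below are the standard Saltelli first-order and Jansen total-order forms (Saltelli et al., 2008); plain Monte Carlo sampling is used here for brevity, whereas the study uses Sobol' quasi-random sequences, and the Ishigami test function is purely illustrative:

```python
import numpy as np

def sobol_indices(model, bounds, n=20000, seed=1):
    """Estimate first-order (S1) and total-order (ST) Sobol' indices.

    Uses two independent sample matrices A and B and the "pick-and-freeze"
    matrices AB_i (column i of A replaced by column i of B).
    """
    rng = np.random.default_rng(seed)
    k = len(bounds)
    lo = np.array([b[0] for b in bounds])
    hi = np.array([b[1] for b in bounds])
    A = lo + (hi - lo) * rng.random((n, k))
    B = lo + (hi - lo) * rng.random((n, k))
    yA, yB = model(A), model(B)
    var = np.var(np.concatenate([yA, yB]))
    S1, ST = np.empty(k), np.empty(k)
    for i in range(k):
        ABi = A.copy()
        ABi[:, i] = B[:, i]
        yABi = model(ABi)
        S1[i] = np.mean(yB * (yABi - yA)) / var        # Saltelli (2010) estimator
        ST[i] = 0.5 * np.mean((yA - yABi) ** 2) / var  # Jansen (1999) estimator
    return S1, ST

# Illustrative stand-in for a CDICE output: the Ishigami test function,
# whose analytic indices are known (S1 ~ [0.31, 0.44, 0.00]).
def ishigami(X):
    return (np.sin(X[:, 0]) + 7 * np.sin(X[:, 1]) ** 2
            + 0.1 * X[:, 2] ** 4 * np.sin(X[:, 0]))

S1, ST = sobol_indices(ishigami, [(-np.pi, np.pi)] * 3)
```

Note that the gap between `ST` and `S1` for a given parameter is a direct measure of how much of its influence is exerted through interactions with other parameters.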
This exercise demonstrates the importance of understanding the non-separable, interactive parameter dependencies that control uncertain IAM projections. We also contrast our findings with the more typical local sensitivity analysis as performed in Nordhaus (2008). Our results illustrate the consequences of using a priori expert elicitations to reduce the set of parameters analyzed, especially within the context of a one-at-a-time (OAT) sensitivity analysis. The results of this global sensitivity analysis provide a strong argument for comprehensive model diagnostics for IAMs to explicitly account for the parametric interactions between their coupled natural and economic components. Moreover, this study illustrates how the sensitivity ranking of model parameters differs for alternative cost metrics, over time, and for alternative emission control strategies.
In Section 2 we describe the DICE IAM and the CDICE simulation model as well as the policy scenarios used in this study. Section 3 presents the methods used and descriptions of the computational experiments. Results and implications are discussed in Section 4, followed by conclusions in Section 5.
Section snippets
The DICE model
Fig. 1 provides a schematic overview of the DICE IAM. In this study, we use version 2007.delta.8b, which is documented in detail in Nordhaus (2008), and was obtained from the author's website (http://nordhaus.econ.yale.edu) in February 2011. The model presents a neoclassical economic growth theory view of the economics of climate change (Nordhaus, 2008). This version of the DICE model builds on more than twenty years of development of a conceptually simple, yet complete, example of a fully
Sobol' sensitivity analysis
We apply a global sensitivity analysis based on the variance decomposition method proposed by Sobol' (2001) and described by Saltelli et al. (2008) to assess the sensitivity in CDICE of the DICE policy scenarios to (1) the exogenous model parameters identified as having the most influence on the model in Nordhaus (2008) and (2) an extended list of model parameters (see Section 3.2 for details). The Sobol' method (Sobol', 2001, Saltelli et al., 2008) decomposes the variance of model outputs from
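In standard notation (Saltelli et al., 2008), the decomposition referred to in this section partitions the output variance into terms of increasing order, with the first-order and total-order indices defined from the conditional variances:

```latex
V(Y) \;=\; \sum_i V_i \;+\; \sum_i \sum_{j>i} V_{ij} \;+\; \dots \;+\; V_{12\dots k},
\qquad
S_i = \frac{V_i}{V(Y)}, \quad
S_{T_i} = 1 - \frac{V_{\sim i}}{V(Y)},
```

where $V_i = \mathrm{Var}\!\left(E[Y \mid x_i]\right)$ and $V_{\sim i}$ is the variance of the expectation of $Y$ conditioned on all parameters except $x_i$; the difference $S_{T_i} - S_i$ measures the contribution of $x_i$ through interactions.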
OAT Sensitivities of the CDICE model outputs
Fig. 3 verifies that we are able to match the Nordhaus OAT-based sensitivity rankings (Nordhaus, 2008, Chapter 7) for the parameters controlling the increase in atmospheric temperature in the first decade of the 22nd century in DICE (Fig. 3A) and CDICE (Fig. 3B) using the BAU policy scenario. The atmospheric temperature increase using reference values of all parameters is 3.2 °C for DICE (Nordhaus, 2008) and 3.1 °C for CDICE, illustrating that the CDICE simulation captures the atmospheric
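The way an OAT analysis can understate interactive effects is easy to reproduce with a toy model: for an output driven by a product term, perturbing one parameter at a time around a reference point shows no response at all, while jointly varying the parameters produces clearly nonzero output variance, all of it attributable to the interaction. The two-parameter model below is purely illustrative and is not part of CDICE:

```python
import numpy as np

# Toy model with a pure interaction: y = x1 * x2, reference point (0, 0).
def model(x1, x2):
    return x1 * x2

ref = np.array([0.0, 0.0])
delta = 0.5

# One-at-a-time (OAT): perturb each parameter separately about the reference.
oat_effect = []
for i in range(2):
    hi_pt, lo_pt = ref.copy(), ref.copy()
    hi_pt[i] += delta
    lo_pt[i] -= delta
    oat_effect.append(float(abs(model(*hi_pt) - model(*lo_pt))))

# Global view: vary both parameters jointly over [-0.5, 0.5].
rng = np.random.default_rng(0)
X = rng.uniform(-delta, delta, size=(100_000, 2))
total_var = np.var(model(X[:, 0], X[:, 1]))

print(oat_effect)  # both OAT effects are exactly 0.0
print(total_var)   # yet the joint-variation output variance is nonzero
```

An OAT screening of this model would rank both parameters as having no influence, which is exactly the kind of mis-classification the global decomposition is designed to catch.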
Conclusions
This study contributes a detailed demonstration of the consequences of using expert elicitations to narrow the set of parameters used for sensitivity and uncertainty analyses of IAMs. We demonstrate how local approaches, such as OAT analysis with a small parameter set varied over a small part of the feasible parameter space, can mis-classify key sensitivities. As a result, research investments to reduce uncertainties guided by an OAT analysis could be biased, and early-warning signs of policy
Acknowledgments
This work was supported by the U.S. Department of Energy, Office of Science, Biological and Environmental Research Program, Integrated Assessment Research Program, Grant No. DE-SC0005171, with additional support from NSF through the Network for Sustainable Climate Risk Management (SCRiM) under NSF cooperative agreement GEO-1240507 and the Penn State Center for Climate Risk Management. The authors thank William Nordhaus for making the DICE model available, and Alex Libardoni, Chris Forest and
References (77)
- et al. Fat tails, exponents, extreme uncertainty: simulating catastrophe in DICE. Ecol. Econ. (2010)
- et al. A general probabilistic framework and global sensitivity analysis of deterministic models: a hydrological case study. Environ. Model. Softw. (2014)
- et al. An effective screening design for sensitivity analysis of large models. Environ. Model. Softw. (2007)
- et al. International climate policy architectures: overview of the EMF22 international scenarios. Energy Econ. (2009)
- et al. On the sources of technological change: what do the models assume? Energy Econ. (2008)
- The status of integrated assessment in climatic policy making: an overview of inconsistencies underlying response functions. Environ. Sci. Policy (2004)
- et al. Modeling endogenous technological change for climate policy analysis. Energy Econ. (2008)
- et al. Ten iterative steps in development and evaluation of environmental models. Environ. Model. Softw. (2006)
- et al. Uncertain climate thresholds and optimal economic growth. J. Environ. Econ. Manag. (2004)
- et al. Selecting among five common modelling approaches for integrated environmental assessment and management. Environ. Model. Softw. (2013)
- Rolling the ‘DICE’: an optimal transition path for controlling greenhouse gases. Resour. Energy Econ.
- CO2 emissions control: comparing policy instruments. Energy Policy
- The optimal choice of climate change policy in the presence of uncertainty. Resour. Energy Econ.
- Global sensitivity measures from given data. Eur. J. Oper. Res.
- Making best use of model evaluations to compute sensitivity indices. Comput. Phys. Commun.
- Sensitivity analysis didn't help. A practitioner's critique of the Stern review. Glob. Environ. Change
- Evaluating integrated assessment models of global climate change. Environ. Model. Softw.
- Uncertainty in integrated assessment models: modeling with MiniCAM 1.0. Energy Policy
- On the distribution of points in a cube and the approximate evaluation of integrals. USSR Comput. Math. Math. Phys.
- Global sensitivity indices for nonlinear mathematical models and their Monte Carlo estimates. Math. Comput. Simul.
- Uncertainty in climate change modeling: can global sensitivity analysis be of help? Risk Anal.
- Sampling methods and sensitivity analysis for large parameter sets. J. Stat. Comput. Simul.
- Sensitivity measures, ANOVA-like techniques and the use of bootstrap. J. Stat. Comput. Simul.
- Uncertainty in temperature projections reduced using carbon cycle and climate observations. Nat. Clim. Change
- Statistics for Experimenters: Design, Innovation, and Discovery
- International Energy Statistics: Carbon Intensity using Purchasing Power Parities
- Guidance on the Development, Evaluation, and Application of Environmental Models
- Social Cost of Carbon for Regulatory Impact Analysis
- Technical Update of the Social Cost of Carbon for Regulatory Impact Analysis
- Impact Assessment Guidelines
- Representing climate and extreme weather events in integrated assessment models: a review of existing methods and options for development. Integr. Assess.
- Evaluation of nine large-scale hydrological models with respect to seasonal runoff climatology in Europe. Water Resour. Res.
- Reconciling theory with observations: elements of a diagnostic approach to model evaluation. Hydrol. Process.
- Key World Energy Statistics
- Probabilistic World Population Projections
- Climate Change 1995: Economic and Social Dimensions of Climate Change, Contribution of Working Group III to the Second Assessment Report
- Climate Change 2007: the Physical Science Basis. Contribution of Working Group I to the Fourth Assessment Report
- Temperature stabilization, ocean heat uptake and radiative forcing overshoot profiles. Clim. Change