MCMC-based local parametric sensitivity estimations

https://doi.org/10.1016/j.csda.2005.09.005

Abstract

Bayesian inferences for complex models need to be obtained by approximation techniques, mainly Markov chain Monte Carlo (MCMC) methods. For these models, sensitivity analysis is a difficult task. A novel, computationally low-cost approach to estimating local parametric sensitivities in Bayesian models is proposed. The method allows the sensitivity measures and their errors to be estimated with the same random sample that has been generated to estimate the quantity of interest. Conditions permitting a derivative-integral interchange in the operator of interest are required. Two illustrative examples show how sensitivity computations with respect to the prior distribution and the loss function are easily obtained in practice.

Introduction

Bayesian statistics has become more popular thanks to the appearance of Markov chain Monte Carlo (MCMC) methods (see Brooks, 1998 for a review and Gilks et al., 1998 for a monograph). These simulation techniques make it possible to obtain numerical solutions to problems based on genuinely complex models. Sometimes, MCMC methods are the only computationally efficient alternative.

When performing a Bayesian analysis, inferences depend on inputs such as the prior distribution, the likelihood or the loss function. Besides the model solution, some description of its sensitivity to the specification of these inputs is necessary. Sensitivity of inferences to the choice of the prior distribution has been widely investigated (see, for example, Berger, 1994). Sivaganesan (1993) and Dey et al. (1996) studied sensitivity with respect to the prior and the likelihood. Martín et al. (1998) considered the loss function, and Martín et al. (2003) investigated joint sensitivity with respect to the utility and the prior distribution. Two relevant monographs on robust Bayesian analysis are Berger et al. (1996) and Ríos Insua and Ruggeri (2000).

Sensitivity analysis can be studied from two viewpoints: local and global. Local sensitivity considers the behavior of posterior quantities of interest under infinitesimal perturbations from a specified input (prior or loss in this paper). On the other hand, global sensitivity quantifies the range of a posterior quantity when the prior (loss) varies in a class. See Sivaganesan (2000) for a comparative review on the local and global approaches to Bayesian robustness.

Several authors have called for sensitivity analyses in complex models that need to be solved by MCMC methods (see, for example, Ríos Insua and Ruggeri, 2000). Some authors, such as Richardson and Green (1997), Hall et al. (2003) and Halekoh and Vach (2003), study parametric sensitivity by solving the model for several values of the prior parameters. They essentially re-run the Markov chain for different parameters of the prior distributions and estimate the quantities of interest under those parameter specifications. This is computationally costly and, in general, not sufficient. It would therefore be convenient to have a general method for estimating local sensitivities. This is the issue addressed in this paper.
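For concreteness, the following is a minimal sketch of this brute-force approach (an illustration written for this text, not code from any of the cited papers): the sampler is re-run at perturbed hyperparameter values and a finite difference of the resulting estimates approximates the local sensitivity. The function `run_mcmc` is a hypothetical sampler interface assumed for the example.

```python
import numpy as np

def brute_force_sensitivity(run_mcmc, f, lam0, eps=1e-2, n_draws=5000):
    """Approximate d E[f(theta) | x] / d lambda at lam0 by re-running the
    sampler at lam0 - eps and lam0 + eps and taking a central difference.
    `run_mcmc(lam, n_draws)` is a hypothetical function returning posterior
    draws under the prior with hyperparameter `lam`."""
    lo = np.mean([f(t) for t in run_mcmc(lam0 - eps, n_draws)])
    hi = np.mean([f(t) for t in run_mcmc(lam0 + eps, n_draws)])
    # Two full MCMC runs are needed per hyperparameter, which is what makes
    # this approach computationally costly for models with many inputs.
    return (hi - lo) / (2 * eps)
```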

The outline of the paper is as follows. In Section 2, the problem is described and some relevant results on local sensitivity analysis are summarized. Section 3 focuses on the parametric sensitivity case. A computationally low-cost method to estimate local parametric sensitivities in models solved by MCMC methods is proposed. Section 4 presents two illustrative examples which show how the proposed method is easily applied in practice. A discussion is presented in Section 5. Finally, an appendix containing the proofs is included.

Section snippets

Local sensitivity estimations

Suppose the interest is focused on the estimation of a quantity that can be expressed as the integral of a real-valued function $f$ over a multidimensional domain with respect to a density $p$, i.e.,
$$\int_{\Theta} f(\theta)\, p(\theta)\, d\theta. \qquad (1)$$
When $p$ is the posterior distribution of $\theta$, i.e., $p(\theta \mid x)$, the integral (1) becomes the posterior expectation of $f(\theta)$. In this case, the posterior mean is recovered when $f$ is the identity function.
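As a minimal illustration (not taken from the paper), integral (1) is approximated in practice by the sample average of $f$ over draws $\theta^{(1)},\ldots,\theta^{(T)}$ from $p$, typically the output of an MCMC sampler. The function names and the toy target below are assumptions for the example.

```python
import numpy as np

def estimate_integral(f, theta_draws):
    """Estimate the integral of f(theta) p(theta) dtheta by the sample mean
    of f over draws theta^(1), ..., theta^(T) approximately distributed as p."""
    values = np.asarray([f(theta) for theta in theta_draws])
    estimate = values.mean()
    # Naive Monte Carlo standard error; correlated MCMC output would call for
    # an effective-sample-size correction (e.g., batch means).
    std_error = values.std(ddof=1) / np.sqrt(len(values))
    return estimate, std_error

# Toy usage: draws from a N(2, 1) "posterior", f = identity (posterior mean).
rng = np.random.default_rng(0)
draws = rng.normal(loc=2.0, scale=1.0, size=5000)
print(estimate_integral(lambda t: t, draws))
```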

In Bayesian decision theory and inference, the elicitation of the prior distribution π(θ) (or f

Parametric classes

In this section, a parametric class of prior distributions $\Gamma = \{\pi_\lambda : \lambda \in \Lambda\}$ and a parametric class of functions $F = \{f_\xi : \xi \in \Xi\}$ are considered. Let $I_\lambda(\pi,f)$ and $I_\xi(\pi,f)$ ($I_\lambda$ and $I_\xi$ for short) denote the dependence of the posterior quantity $I(\pi,f)$ on $\lambda = (\lambda_1, \ldots, \lambda_m)$ and $\xi = (\xi_1, \ldots, \xi_m)$, respectively.

Firstly, sensitivity with respect to the prior distribution is considered. Assume that a local sensitivity analysis around $\lambda = \lambda_0$ is required in the class $\Gamma$. Then, $I_\lambda$ is, as a function of the parameters, an operator from $\mathbb{R}^m$ to the
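To illustrate how the same MCMC output can be reused, one standard route (a sketch under the stated assumption that the derivative-integral interchange holds; the covariance form below is a well-known identity and may differ in detail from the estimator developed in the paper) writes the partial derivative of $I_\lambda$ with respect to a hyperparameter at $\lambda_0$ as the posterior covariance between $f(\theta)$ and the prior score $\partial \log \pi_\lambda(\theta)/\partial \lambda_j$ evaluated at $\lambda_0$. That covariance can be estimated from the draws already available, with no re-running of the chain. The normal-prior toy example and all function names are assumptions.

```python
import numpy as np

def prior_sensitivity(f_vals, score_vals):
    """Estimate d I_lambda / d lambda_j at lambda_0 as the posterior covariance
    Cov(f(theta), d log pi_lambda(theta)/d lambda_j |_{lambda_0}),
    using a single set of MCMC draws from the posterior under pi_{lambda_0}."""
    f_vals = np.asarray(f_vals)
    score_vals = np.asarray(score_vals)
    return np.cov(f_vals, score_vals, ddof=1)[0, 1]

# Toy usage: prior N(mu0, tau0^2) on theta; sensitivity of the posterior mean
# (f = identity) with respect to the prior mean mu0.
rng = np.random.default_rng(1)
theta_draws = rng.normal(1.5, 0.4, size=5000)      # stand-in for MCMC output
mu0, tau0 = 0.0, 2.0
score_mu0 = (theta_draws - mu0) / tau0**2          # d log pi / d mu0
print(prior_sensitivity(theta_draws, score_mu0))
```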

Applications

Two illustrative examples are considered in this section: an application to a normal mixture model and a decision problem based on a first-order autoregressive model.

Example 1 Normal mixture model

Bowmaker et al. (1985) analyzed data on the peak sensitivity wavelengths for individual microspectrophotometric records on a small set of monkeys' eyes. Part of the analysis involves fitting a mixture of two normal distributions with common variance, so that each observation $y_i$ is assumed to be drawn from one of two groups. Let $T_i =$
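To make the setting concrete, below is a minimal Gibbs-sampler sketch for a two-component normal mixture with common variance and latent allocations $T_i$. The conjugate normal and inverse-gamma priors, the Beta prior on the mixing weight, the hyperparameter values and the simulated data are all assumptions for illustration and are not taken from the paper; label switching is ignored.

```python
import numpy as np

def gibbs_mixture(y, n_iter=5000, seed=0):
    """Gibbs sampler for y_i ~ p N(mu1, s2) + (1 - p) N(mu2, s2) with latent
    allocations T_i, assuming N(m0, v0) priors on the means, an inverse-gamma
    prior on the common variance and a Beta prior on the weight p."""
    rng = np.random.default_rng(seed)
    n = len(y)
    m0, v0 = y.mean(), 100.0        # prior mean / variance for mu_k
    a0, b0 = 2.0, 1.0               # inverse-gamma prior for s2
    alpha, beta = 1.0, 1.0          # Beta prior for the weight p
    mu = np.array([y.min(), y.max()], dtype=float)
    s2, p = y.var(), 0.5
    draws = np.empty((n_iter, 4))   # mu1, mu2, s2, p
    for it in range(n_iter):
        # 1. Allocations T_i given the current parameters (0 = group 1).
        d1 = p * np.exp(-0.5 * (y - mu[0]) ** 2 / s2)
        d2 = (1 - p) * np.exp(-0.5 * (y - mu[1]) ** 2 / s2)
        T = (rng.uniform(size=n) > d1 / (d1 + d2)).astype(int)
        # 2. Mixing weight.
        n1 = np.sum(T == 0)
        p = rng.beta(alpha + n1, beta + n - n1)
        # 3. Component means (conjugate normal update).
        for k in range(2):
            yk = y[T == k]
            prec = 1.0 / v0 + len(yk) / s2
            mean = (m0 / v0 + yk.sum() / s2) / prec
            mu[k] = rng.normal(mean, np.sqrt(1.0 / prec))
        # 4. Common variance (conjugate inverse-gamma update).
        resid = y - mu[T]
        s2 = 1.0 / rng.gamma(a0 + n / 2, 1.0 / (b0 + 0.5 * np.sum(resid ** 2)))
        draws[it] = mu[0], mu[1], s2, p
    return draws

# Toy usage with simulated wavelengths in place of the original data.
rng = np.random.default_rng(42)
y = np.concatenate([rng.normal(535, 4, 30), rng.normal(549, 4, 18)])
print(gibbs_mixture(y, n_iter=2000)[1000:].mean(axis=0))
```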

Discussion

It is widely recognized that formal sensitivity analysis is a difficult task in Bayesian modeling. Several local sensitivity measures based on functional derivatives have been proposed in the recent literature; the Fréchet derivative and its norm is one of them. In the particular case of parametric classes, the Fréchet derivative is the gradient vector. The method proposed in this paper uses the components of the gradient vector, i.e., the partial derivatives, as sensitivity

Acknowledgements

The authors thank a referee and an associate editor for comments and suggestions which have substantially improved the readability and the content of this paper. Discussions with David Ríos are also gratefully acknowledged. This work has been partially supported by Junta de Extremadura, Spain (Project IPR00A075).

References (32)

  • Cuevas, A., et al. On differentiability properties of Bayes operators.
  • Dey, D., Ghosh, S., Lou, K., 1996. On local sensitivity measures in Bayesian analysis (with discussion). In: Berger,...
  • Diaconis, P., et al., 1986. On the consistency of Bayes estimates. Annals of Statistics.
  • DiCiccio, T.J., et al., 1997. Computing Bayes factors by combining simulation and asymptotic approximations. J. Amer. Statist. Assoc.
  • Dorazio, R.M., et al., 2003. Bayesian inference and decision theory - a framework for decision making in natural resource management. Ecol. Appl.
  • French, S., et al., 2000. Statistical Decision Theory.