MCMC-based local parametric sensitivity estimations
Introduction
Bayesian statistics has become more popular thanks to the appearance of Markov chain Monte Carlo (MCMC) methods (see Brooks, 1998 for a review and Gilks et al., 1998 for a monograph). These simulation techniques make it possible to obtain numerical solutions to problems based on highly complex models. Sometimes, MCMC methods are the only computationally efficient alternative.
When performing a Bayesian analysis, inferences depend on input models such as the prior distribution, the likelihood or the loss function. Besides the model solution, some description of its sensitivity to the specification of these inputs is necessary. Sensitivity of inferences to the choice of the prior distribution has been widely investigated (see, for example, Berger, 1994). Sivaganesan (1993) and Dey et al. (1996) studied sensitivity with respect to the prior and the likelihood. Martín et al. (1998) considered the loss function, and Martín et al. (2003) investigated joint sensitivity with respect to the utility and the prior distribution. Two relevant monographs on robust Bayesian analysis are provided by Berger et al. (1996) and Ríos Insua and Ruggeri (2000).
Sensitivity analysis can be studied from two viewpoints: local and global. Local sensitivity considers the behavior of posterior quantities of interest under infinitesimal perturbations from a specified input (prior or loss in this paper). On the other hand, global sensitivity quantifies the range of a posterior quantity when the prior (loss) varies in a class. See Sivaganesan (2000) for a comparative review on the local and global approaches to Bayesian robustness.
Several authors have called for sensitivity analyses in complex models that must be solved by MCMC methods (see, for example, Ríos Insua and Ruggeri, 2000). Some authors, such as Richardson and Green (1997), Hall et al. (2003) and Halekoh and Vach (2003), study parametric sensitivity by solving the model for several values of the prior parameters: essentially, they re-run the Markov chain for different parameters of the prior distributions and estimate the quantities of interest under each parameter specification. This is computationally costly and, in general, not sufficient. It would therefore be convenient to have a general method for estimating local sensitivities. This is the issue addressed in this paper.
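To see the cost of this re-run strategy, the following sketch (a hypothetical toy setup, not taken from the paper) estimates the sensitivity of a posterior mean to a prior hyperparameter b by finite differences: every perturbed value of b requires a complete extra Metropolis run, and the Monte Carlo noise of both runs enters the difference quotient. A conjugate normal model is used so the brute-force estimate can be checked against the closed-form answer.

```python
import numpy as np

# Toy conjugate model (hypothetical illustration): y_i ~ N(mu, sigma2),
# prior mu ~ N(b, tau2). The sensitivity of the posterior mean to b
# is known analytically, so the brute-force estimate can be checked.
sigma2, tau2 = 1.0, 4.0
y = np.random.default_rng(0).normal(1.5, 1.0, size=20)

def posterior_mean_by_mcmc(b, seed, n_iter=50_000, burn=5_000):
    """One full Metropolis run targeting p(mu | y) for prior mean b."""
    rng = np.random.default_rng(seed)
    def log_post(mu):
        return (-0.5 * np.sum((y - mu) ** 2) / sigma2
                - 0.5 * (mu - b) ** 2 / tau2)
    mu, chain = y.mean(), []
    for _ in range(n_iter):
        prop = mu + rng.normal(0.0, 0.5)
        if np.log(rng.uniform()) < log_post(prop) - log_post(mu):
            mu = prop
        chain.append(mu)
    return float(np.mean(chain[burn:]))

# Finite-difference sensitivity: two complete chains per hyperparameter.
b, h = 0.0, 2.0
fd = (posterior_mean_by_mcmc(b + h, seed=1)
      - posterior_mean_by_mcmc(b, seed=2)) / h

# Analytic value for this conjugate model: posterior variance / tau2.
exact = (1.0 / (len(y) / sigma2 + 1.0 / tau2)) / tau2
print(fd, exact)
```

Two full chains are needed for a single partial derivative here; a model with k hyperparameters would require k + 1 runs, which is exactly the cost the proposed method avoids.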
The outline of the paper is as follows. In Section 2, the problem is described and some relevant results on local sensitivity analysis are summarized. Section 3 focuses on the parametric sensitivity case. A computationally low-cost method to estimate local parametric sensitivities in models solved by MCMC methods is proposed. Section 4 presents two illustrative examples which show how the proposed method is easily applied in practice. A discussion is presented in Section 5. Finally, an appendix containing the proofs is included.
Local sensitivity estimations
Suppose the interest is focused on the estimation of a quantity that can be expressed as the integral of a real-valued function f over a multidimensional domain Θ with respect to a density p, i.e., I(f) = ∫_Θ f(θ) p(θ) dθ (1). When p is the posterior distribution for θ, i.e., p(θ) = π(θ | data), the integral (1) becomes the posterior expectation of f(θ). In this case, the posterior mean is recovered when f is the identity function.
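This is the quantity MCMC methods approximate by an ergodic average: given draws θ^(1), ..., θ^(T) from p, I(f) is estimated by (1/T) Σ_t f(θ^(t)). A minimal sketch, with a hypothetical choice of p and f for which the exact answer is known:

```python
import numpy as np

rng = np.random.default_rng(0)

# Draws theta^(1), ..., theta^(T) from p; here p = N(1, 1) is chosen
# (hypothetically) so the estimate can be checked against a known value.
theta = rng.normal(1.0, 1.0, size=200_000)

# Ergodic-average estimate of I(f) = E_p[f(theta)] for f(theta) = theta^2.
# For N(1, 1): E[theta^2] = Var(theta) + E[theta]^2 = 1 + 1 = 2.
estimate = (theta ** 2).mean()
print(estimate)
```

In practice the draws come from a Markov chain with stationary distribution p rather than from independent sampling, but the averaging step is identical.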
In Bayesian decision theory and inference, the elicitation of the prior distribution (or the loss function) is a delicate task, which motivates studying how the estimated quantity reacts to changes in these inputs.
Parametric classes
In this section, a parametric class of prior distributions, {π(· | b): b ∈ B}, and a parametric class of functions, {f(· | c): c ∈ C}, are considered. Let I(b) and I(c) (for short) denote the dependence of the posterior quantity on b and c, respectively.
Firstly, sensitivity with respect to the prior distribution is considered. Assume that a local sensitivity analysis around b₀ is required in the class {π(· | b): b ∈ B}. Then I(b) is, as a function of the parameters, an operator from B to the real line, and its local behavior around b₀ is described by the gradient vector.
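The components of this gradient can be estimated from a single MCMC run. A minimal sketch, assuming a conjugate normal model and using the standard score-covariance identity ∂/∂b E[f(θ) | data] = Cov_post(f(θ), ∂ log π(θ | b)/∂b) — a general identity used here for illustration, not necessarily the paper's exact estimator:

```python
import numpy as np

rng = np.random.default_rng(0)

# Conjugate normal model (hypothetical illustration): y_i ~ N(mu, sigma2)
# with prior mu ~ N(b, tau2), so the sensitivity is known in closed form.
sigma2, tau2, b = 1.0, 4.0, 0.0
y = rng.normal(1.5, np.sqrt(sigma2), size=20)
n, ybar = len(y), y.mean()

# Exact posterior, available here because the model is conjugate.
post_var = 1.0 / (n / sigma2 + 1.0 / tau2)

# A single Metropolis run targeting the posterior (stands in for the
# generic MCMC output produced when solving the model).
def log_post(mu):
    return (-0.5 * np.sum((y - mu) ** 2) / sigma2
            - 0.5 * (mu - b) ** 2 / tau2)

mu, chain = ybar, []
for _ in range(50_000):
    prop = mu + rng.normal(0.0, 0.5)
    if np.log(rng.uniform()) < log_post(prop) - log_post(mu):
        mu = prop
    chain.append(mu)
chain = np.array(chain[5_000:])  # discard burn-in

# Local sensitivity of the posterior mean to the prior mean b, estimated
# from the SAME chain via the score-covariance identity
#   d/db E[mu | y] = Cov_post(mu, d log pi(mu | b) / db),
# where d log pi(mu | b)/db = (mu - b) / tau2 for the normal prior.
score = (chain - b) / tau2
sens_mcmc = float(np.cov(chain, score)[0, 1])

# Analytic value for comparison: posterior variance / prior variance.
sens_exact = post_var / tau2
print(sens_mcmc, sens_exact)
```

No re-run at a perturbed hyperparameter is needed: the derivative estimate reuses the draws already produced when solving the model, which is what makes the local approach computationally cheap.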
Applications
Two illustrative examples are considered in this section: an application to a normal mixture model and a decision problem based on a first-order autoregressive model.

Example 1 (Normal mixture model). Bowmaker et al. (1985) analyzed data on the peak sensitivity wavelengths for individual microspectrophotometric records on a small set of monkeys' eyes. Part of the analysis involves fitting a mixture of two normal distributions with common variance, so that each observation is assumed to be drawn from one of two groups. Let each observation be drawn from N(μ₁, σ²) or N(μ₂, σ²) according to its (unobserved) group membership.
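A Gibbs sampler for this kind of model alternates between the latent group allocations and the component means. The sketch below is deliberately simplified (equal mixture weights, known common variance, simulated data standing in for the wavelength measurements, which are not reproduced here), so it illustrates the sampling scheme rather than the full model fitted in the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated data from a two-component normal mixture with common known
# variance (a hypothetical stand-in for the peak-wavelength data).
true_mu, sigma2 = (535.0, 549.0), 14.0
z_true = rng.uniform(size=48) < 0.6
y = np.where(z_true,
             rng.normal(true_mu[0], np.sqrt(sigma2), size=48),
             rng.normal(true_mu[1], np.sqrt(sigma2), size=48))

# Gibbs sampler, simplified: equal mixture weights and known variance,
# with independent N(m0, s0sq) priors on the two component means.
m0, s0sq = 540.0, 1000.0
mu = np.array([530.0, 550.0])
draws = []
for _ in range(3_000):
    # Step 1: sample the latent allocations given the current means.
    dens = np.exp(-0.5 * (y[:, None] - mu[None, :]) ** 2 / sigma2)
    z = rng.uniform(size=y.size) < dens[:, 0] / dens.sum(axis=1)
    # Step 2: sample each mean given its allocated observations
    # (conjugate normal update).
    for k, mask in enumerate((z, ~z)):
        prec = mask.sum() / sigma2 + 1.0 / s0sq
        mean = (y[mask].sum() / sigma2 + m0 / s0sq) / prec
        mu[k] = rng.normal(mean, np.sqrt(1.0 / prec))
    draws.append(mu.copy())

post = np.array(draws[500:]).mean(axis=0)  # posterior means of mu_1, mu_2
print(post)
```

The hyperparameters m0 and s0sq of the priors on the component means are exactly the kind of inputs whose local influence the proposed method measures from this single chain.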
Discussion
It is widely recognized that formal sensitivity analysis is a difficult task in Bayesian modeling. Several local sensitivity measures based on functional derivatives have been proposed in the recent literature; the Fréchet derivative and its norm is one of them. In the particular case of parametric classes, the Fréchet derivative is the gradient vector. The method proposed in this paper uses the components of the gradient vector, i.e., the partial derivatives, as sensitivity measures.
Acknowledgements
The authors thank a referee and an associate editor for comments and suggestions which have substantially improved the readability and the content of this paper. Discussions with David Ríos are also gratefully acknowledged. This work has been partially supported by Junta de Extremadura, Spain (Project IPR00A075).
References (32)
- Bowmaker et al., 1985. Two types of trichromatic squirrel monkey share pigment in the red-green spectral region. Vision Res.
- Hall et al., 2003. Bayesian and profile likelihood change point methods for modeling cognitive function over time. Comput. Statist. Data Anal.
- et al., 1995. Density based classes of priors: infinitesimal properties and approximations. J. Statist. Plann. Inference.
- 1993. Robust Bayesian diagnostics. J. Statist. Plann. Inference.
- Berger, J.O., 1985. Statistical Decision Theory and Bayesian Analysis.
- Berger, J.O., 1994. An overview of robust Bayesian analysis. Test.
- Berger, J.O., Betro, B., Moreno, E., Pericchi, L.R., Ruggeri, F., Salinetti, G., Wasserman, L., 1996. Bayesian...
- Ríos Insua, D., Ruggeri, F. (Eds.), 2000. Robust Bayesian Analysis.
- Best, N.G., Cowles, M.K., Vines, S.K., 1997. Coda: convergence diagnosis and output analysis software for Gibbs...
- Brooks, S.P., 1998. Markov chain Monte Carlo method and its application. The Statistician.
- On differentiability properties of Bayes operators.
- On the consistency of Bayes estimates. Annals of Statistics.
- Computing Bayes factors by combining simulation and asymptotic approximations. J. Amer. Statist. Assoc.
- Bayesian inference and decision theory—a framework for decision making in natural resource management. Ecol. Appl.
- Statistical Decision Theory.