Posterior inference in the random intercept model based on samples obtained with Markov chain Monte Carlo methods

Summary

Many papers (including most of the papers in this issue of Computational Statistics) deal with Markov chain Monte Carlo (MCMC) methods. This paper gives an introduction to the augmented Gibbs sampler (a special case of MCMC), illustrated using the random intercept model. A 'nonstandard' application of the augmented Gibbs sampler is discussed to illustrate the power of MCMC methods. Furthermore, it is shown that the posterior sample resulting from an application of MCMC can be used for more than assessing convergence and computing simple estimators such as the posterior expectation and standard deviation: posterior samples give access to many other inferential possibilities. The frequency properties of some of these possibilities are evaluated in a simulation study.
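To make the Summary concrete, the sketch below shows what a Gibbs sampler for a simple random intercept model y_ij = mu + theta_j + e_ij, with e_ij ~ N(0, sigma²) and theta_j ~ N(0, tau²), could look like, and how the resulting posterior sample can be summarised beyond point estimates. This is a minimal illustration assuming a flat prior on mu and conjugate inverse-gamma priors on both variances; the function name gibbs_random_intercept, the hyperparameters a0, b0, c0, d0 and the simulated data are choices made here for illustration and do not reproduce the paper's exact augmented sampler.

```python
import numpy as np


def gibbs_random_intercept(groups, n_iter=5000, burn_in=1000, seed=1,
                           a0=0.001, b0=0.001, c0=0.001, d0=0.001):
    """Gibbs sampler for y_ij = mu + theta_j + e_ij,
    e_ij ~ N(0, sigma2), theta_j ~ N(0, tau2).

    `groups` is a list of 1-d arrays, one array of observations per group.
    Assumed priors: flat on mu, InvGamma(a0, b0) on sigma2, InvGamma(c0, d0) on tau2.
    """
    rng = np.random.default_rng(seed)
    J = len(groups)                                   # number of groups
    n_j = np.array([len(y) for y in groups])          # group sizes
    ybar_j = np.array([y.mean() for y in groups])     # group means
    N = n_j.sum()

    # starting values
    mu, sigma2, tau2 = np.concatenate(groups).mean(), 1.0, 1.0
    theta = np.zeros(J)
    draws = {"mu": [], "sigma2": [], "tau2": []}

    for it in range(n_iter):
        # 1. random intercepts theta_j | rest: normal full conditional
        prec = n_j / sigma2 + 1.0 / tau2
        mean = (n_j * (ybar_j - mu) / sigma2) / prec
        theta = rng.normal(mean, np.sqrt(1.0 / prec))

        # 2. grand mean mu | rest: flat prior gives a normal full conditional
        resid_mean = np.sum([(y - th).sum() for y, th in zip(groups, theta)]) / N
        mu = rng.normal(resid_mean, np.sqrt(sigma2 / N))

        # 3. residual variance sigma2 | rest: inverse gamma
        sse = np.sum([((y - mu - th) ** 2).sum() for y, th in zip(groups, theta)])
        sigma2 = 1.0 / rng.gamma(a0 + N / 2.0, 1.0 / (b0 + sse / 2.0))

        # 4. intercept variance tau2 | rest: inverse gamma
        tau2 = 1.0 / rng.gamma(c0 + J / 2.0, 1.0 / (d0 + (theta ** 2).sum() / 2.0))

        if it >= burn_in:
            draws["mu"].append(mu)
            draws["sigma2"].append(sigma2)
            draws["tau2"].append(tau2)

    return {k: np.array(v) for k, v in draws.items()}


if __name__ == "__main__":
    # simulate data from the model, then summarise the posterior sample
    rng = np.random.default_rng(0)
    groups = [10.0 + rng.normal(0, 1.0) + rng.normal(0, 2.0, size=25)
              for _ in range(30)]
    post = gibbs_random_intercept(groups)
    for name, s in post.items():
        lo, hi = np.percentile(s, [2.5, 97.5])
        print(f"{name}: mean={s.mean():.2f}  sd={s.std():.2f}  "
              f"95% interval=({lo:.2f}, {hi:.2f})")
```

Because the retained draws approximate the joint posterior, any functional of the parameters (for instance the posterior probability that tau² exceeds a threshold, or the ranking of the group intercepts) can be estimated by evaluating it in each draw and averaging; this is the kind of additional inferential possibility referred to in the Summary.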

Cite this article

Hoijtink, H. Posterior inference in the random intercept model based on samples obtained with Markov chain Monte Carlo methods. Computational Statistics 15, 315–336 (2000). https://doi.org/10.1007/s001800000037
