Abstract
Markov Chain Monte Carlo (MCMC) techniques generate samples that closely approximate a given multivariate probability distribution; for certain algorithms, such as Metropolis-Hastings, the target function does not even need to be normalised. As with other Monte Carlo techniques, MCMC relies on repeated random sampling to exploit the law of large numbers. Samples are produced by running a Markov chain that is constructed, with the help of a proposal distribution, so that its stationary distribution matches the target function. This approach may be used for optimisation tasks, for approximating solutions to non-deterministic polynomial time (NP) problems, for estimating integrals using importance sampling, and for cryptographic decoding. This paper serves as an introduction to MCMC techniques and some of their applications.
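To make the sampling idea in the abstract concrete, the following is a minimal sketch of a random-walk Metropolis-Hastings sampler. It is an illustration only, not the paper's own implementation: the one-dimensional standard-normal target, the Gaussian random-walk proposal, and the step size are all assumptions introduced here. The key point it demonstrates is that the target density is evaluated only up to a constant, so no normalisation is required.

# Minimal random-walk Metropolis-Hastings sketch (illustrative assumptions:
# 1-D standard-normal target known only up to a constant, Gaussian proposal).
import math
import random

def unnormalised_target(x):
    # Proportional to a standard normal density; the normalising constant
    # is deliberately omitted, since Metropolis-Hastings never needs it.
    return math.exp(-0.5 * x * x)

def metropolis_hastings(n_samples, step=1.0, x0=0.0):
    samples = []
    x = x0
    for _ in range(n_samples):
        # Propose a move from a symmetric Gaussian random walk.
        proposal = x + random.gauss(0.0, step)
        # Accept with probability min(1, pi(proposal) / pi(x)).
        accept_prob = min(1.0, unnormalised_target(proposal) / unnormalised_target(x))
        if random.random() < accept_prob:
            x = proposal
        samples.append(x)
    return samples

# Usage: draw 10,000 samples and estimate the mean of the target;
# for a long enough chain the estimate should be close to 0.
draws = metropolis_hastings(10_000)
print(sum(draws) / len(draws))

Because the acceptance ratio uses only the ratio of target values, any unnormalised function proportional to the desired distribution works, which is exactly the property the abstract highlights.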