Bayesian Methods

Definition

The two most important concepts used in Bayesian modeling are probability and utility. Probabilities are used to model our belief about the state of the world and utilities are used to model the value to us of different outcomes, thus to model costs and benefits. Probabilities are represented in the form of \(p(x\vert C)\), where C is the current known context and x is some event(s) of interest from a space χ. The left and right arguments of the probability function are in general propositions (in the logical sense). Probabilities are updated based on new evidence or outcomes y using Bayes rule, which takes the form

$$p(x\vert C,y) = \frac{p(x\vert C)\,p(y\vert x,C)}{p(y\vert C)} = \frac{p(x\vert C)\,p(y\vert x,C)}{\sum _{x^{\prime}\in \chi }p(x^{\prime}\vert C)\,p(y\vert x^{\prime},C)},$$

where χ is the discrete domain of x. More generally, any measurable set can be used for the domain χ. An integral or mixed sum and integral can replace the sum. For a utility function u(x) of some event x, for instance the benefit of a...
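For a discrete domain χ, the update above is a one-line normalization. The sketch below is illustrative only (the state space, likelihood values, and utility vector are invented for the example, not taken from the entry); it shows the posterior computed from a prior and a likelihood, with the denominator p(y|C) obtained as the sum over χ, followed by an expected utility under the posterior:

```python
import numpy as np

def bayes_update(prior, likelihood):
    """Posterior p(x|C,y) from prior p(x|C) and likelihood p(y|x,C).

    The normalizer is p(y|C) = sum over chi of p(x|C) p(y|x,C).
    """
    unnormalized = prior * likelihood
    return unnormalized / unnormalized.sum()

# Hypothetical three-state domain chi = {x1, x2, x3}.
prior = np.array([0.5, 0.3, 0.2])        # p(x|C)
likelihood = np.array([0.1, 0.7, 0.2])   # p(y|x,C) for the observed y

posterior = bayes_update(prior, likelihood)

# Expected utility of the posterior under an invented utility vector u(x).
utilities = np.array([1.0, 0.0, -1.0])
expected_utility = posterior @ utilities
```

For a continuous or mixed domain, the same normalization is carried out with an integral (or mixed sum and integral) in place of the sum.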


Notes

  1. A good introduction to the problems of uncertainty and the philosophical issues behind the Bayesian treatment of probability is Lindley (2006). From the statistical machine learning perspective, a good introductory text is MacKay (2003), who carefully covers information theory, probability, and inference, but not so much statistical machine learning. Another alternative introduction to probabilities is the posthumously completed and published work of Jaynes (2003). Discussions from the frequentist-versus-Bayesian battlefront can be found in works such as Rosenkrantz (1983), and from the approximate artificial intelligence versus probabilistic battlefront in discussion articles such as Cheeseman's (1988) and the many responses and rebuttals. It should be noted that it is the continued success in applications that has really led these methods into the mainstream, not the entertaining polemics. Good mathematical statistics textbooks, such as Casella and Berger (2001), cover the breadth of statistical methods and therefore handle basic Bayesian theory. A more comprehensive treatment is given in Bayesian texts such as Gelman et al. (2003). Most advanced statistical machine learning textbooks cover Bayesian methods, but to fully understand the subtleties of prior beliefs and Bayesian methodology one needs to consult the more advanced Bayesian literature. A detailed theoretical reference for Bayesian methods is Bernardo and Smith (1994).

Recommended Reading

  • Bernardo J, Smith A (1994) Bayesian theory. Wiley, Chichester

  • Casella G, Berger R (2001) Statistical inference, 2nd edn. Duxbury, Pacific Grove

  • Cheeseman P (1988) An inquiry into computer understanding. Comput Intell 4(1):58–66

  • Gelman A, Carlin J, Stern H, Rubin D (2003) Bayesian data analysis, 2nd edn. Chapman & Hall/CRC Press, Boca Raton

  • Horvitz E, Heckerman D, Langlotz C (1986) A framework for comparing alternative formalisms for plausible reasoning. In: Fifth national conference on artificial intelligence, Philadelphia, pp 210–214

  • Jaynes E (2003) Probability theory: the logic of science. Cambridge University Press, New York

  • Lindley D (2006) Understanding uncertainty. Wiley, Hoboken

  • MacKay D (2003) Information theory, inference, and learning algorithms. Cambridge University Press, Cambridge

  • Rosenkrantz R (ed) (1983) E.T. Jaynes: papers on probability, statistics and statistical physics. D. Reidel, Dordrecht

  • Wainwright MJ, Jordan MI (2008) Graphical models, exponential families, and variational inference. Now Publishers, Hanover


Copyright information

© 2017 Springer Science+Business Media New York

Cite this entry

Buntine, W.L. (2017). Bayesian Methods. In: Sammut, C., Webb, G.I. (eds) Encyclopedia of Machine Learning and Data Mining. Springer, Boston, MA. https://doi.org/10.1007/978-1-4899-7687-1_63
