Definition
Cross entropy is a concept in information theory that measures the discrepancy between two probability distributions.
Theory
For two distributions \(p(x)\) and \(q(x)\) defined on the same space, the cross entropy is defined as
\[ H(p, q) = \mathrm{E}_{p}[-\log q(X)] = H(p) + KL(p, q), \]
where \(H(p) = \mathrm{E}_{p}[-\log p(X)]\) is the entropy of p and \(KL(p,q) = \mathrm{E}_{p}[\log (p(X)/q(X))]\) is the Kullback-Leibler divergence from p to q. The Kullback-Leibler divergence is also called relative entropy.
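As an illustration (not part of the original entry), a minimal NumPy sketch with small, arbitrarily chosen discrete distributions p and q can verify numerically that the cross entropy decomposes as H(p, q) = H(p) + KL(p, q):

import numpy as np

# Two discrete distributions on the same finite space (illustrative values).
p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])

cross_entropy = -np.sum(p * np.log(q))      # H(p, q) = E_p[-log q(X)]
entropy_p = -np.sum(p * np.log(p))          # H(p)    = E_p[-log p(X)]
kl_pq = np.sum(p * np.log(p / q))           # KL(p, q) = E_p[log(p(X)/q(X))]

# The decomposition H(p, q) = H(p) + KL(p, q) holds up to floating-point error.
assert np.isclose(cross_entropy, entropy_p + kl_pq)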
The cross entropy method is a Monte Carlo method for rare event simulation and stochastic optimization [2].
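The following is a minimal sketch of the cross entropy method applied to continuous optimization, assuming a Gaussian sampling distribution and an illustrative one-dimensional objective (the function, sample sizes, and iteration counts are chosen for exposition and are not from the original entry):

import numpy as np

def cross_entropy_method(objective, mu=0.0, sigma=5.0,
                         n_samples=100, n_elite=10, n_iters=50):
    """Maximize a 1-D objective by iteratively refitting a Gaussian
    sampling distribution to the elite (highest-scoring) samples."""
    rng = np.random.default_rng(0)
    for _ in range(n_iters):
        samples = rng.normal(mu, sigma, n_samples)   # sample candidates
        scores = objective(samples)                  # evaluate objective
        elite = samples[np.argsort(scores)[-n_elite:]]  # keep the best n_elite
        mu, sigma = elite.mean(), elite.std() + 1e-8    # refit the Gaussian
    return mu

# Example: the maximizer of -(x - 3)^2 is x = 3.
print(cross_entropy_method(lambda x: -(x - 3.0) ** 2))

Each iteration concentrates the sampling distribution on the region where the objective is largest (or, in rare event simulation, on the region where the rare event occurs), which is what makes the method effective for both uses.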
References
Cover TM, Thomas JA (1991) Elements of information theory. Wiley, New York
Rubinstein RY (1997) Optimization of computer simulation models with rare events. Eur J Oper Res 99:89–112