Abstract:
It is well known that maximum entropy distributions, subject to appropriate moment constraints, arise in physics and mathematics. In an attempt to find a physical reason for the appearance of maximum entropy distributions, the following theorem is offered. The conditional distribution of $X_1$ given the empirical observation $(1/n)\sum_{i=1}^{n} h(X_i) = \alpha$, where $X_1, X_2, \cdots$ are independent identically distributed random variables with common density $g$, converges to $f_{\lambda}(x) = e^{\lambda^{t} h(x)} g(x)$ (suitably normalized), where $\lambda$ is chosen to satisfy $\int f_{\lambda}(x) h(x)\,dx = \alpha$. Thus the conditional distribution of a given random variable $X_1$ is the (normalized) product of the maximum entropy distribution and the initial distribution. This distribution is the maximum entropy distribution when $g$ is uniform. The proof of this and related results relies heavily on the work of Zabell and Lanford.
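As an illustration (not part of the published result), the stated limit can be checked numerically for a concrete choice of base density and constraint function. The sketch below assumes $g$ is the Exp(1) density, $h(x) = x$, and a target value $\alpha = 0.6$; it solves for $\lambda$ so that the tilted density $f_\lambda$ has mean $\alpha$, then estimates the conditional distribution of $X_1$ by keeping only blocks whose empirical mean falls near $\alpha$.

```python
import numpy as np
from scipy.optimize import brentq

# Illustrative Monte Carlo sketch (assumed setup, not from the paper):
# g = Exp(1), h(x) = x, and a target alpha below the unconstrained mean 1.
# The tilted density f_lambda(x) ~ exp(lambda * x) g(x) is then Exp(1 - lambda),
# so the moment condition  int f_lambda(x) h(x) dx = alpha  reads 1/(1 - lambda) = alpha.
rng = np.random.default_rng(0)
alpha, n, tol = 0.6, 50, 0.02

# Solve for lambda numerically (here it is simply 1 - 1/alpha).
lam = brentq(lambda t: 1.0 / (1.0 - t) - alpha, -50.0, 0.99)

# Draw i.i.d. blocks X_1, ..., X_n from g and keep X_1 whenever the block's
# empirical mean (1/n) sum h(X_i) falls within tol of alpha.
x = rng.exponential(1.0, size=(200_000, n))
kept = x[np.abs(x.mean(axis=1) - alpha) < tol, 0]

# The conditional law of X_1 should be close to Exp(1 - lambda); in particular
# its mean should be near 1/(1 - lambda) = alpha rather than the prior mean 1.
print(f"lambda = {lam:.3f}")
print(f"accepted blocks: {kept.size}")
print(f"conditional mean of X_1 = {kept.mean():.3f}  (tilted-density mean = {alpha})")
```

For this exponential example the conditional sample mean of $X_1$ concentrates near $\alpha$, consistent with the theorem's claim that conditioning on the empirical constraint tilts the initial density $g$ toward the (normalized) exponential-family form $e^{\lambda^t h(x)} g(x)$.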
Published in: IEEE Transactions on Information Theory (Volume 27, Issue 4, July 1981)