
Concentration Theorems for Entropy and Free Energy

Large Systems · Problems of Information Transmission

Abstract

Jaynes's entropy concentration theorem states that, for most words \(\omega_1 \ldots \omega_N\) of length N such that \(\sum_{i=1}^{N} f(\omega_i) \approx vN\), the empirical frequencies of the values of a function f are close to the probabilities that maximize the Shannon entropy given the value v of the mathematical expectation of f. Using the notion of algorithmic entropy, we define entropies for the Bose and Fermi statistical models of unordered data and prove new variants of Jaynes's concentration theorem for these models. We also present concentration properties of the free energy in the case of a nonisolated isothermal system. Exact relations for the algorithmic entropy and free energy at extreme points are obtained and used to derive tight bounds on fluctuations of energy levels at equilibrium points.
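
As a concrete illustration of the classical (Maxwell-Boltzmann) case of Jaynes's theorem, the following minimal Python sketch, which is not part of the paper, shows the concentration phenomenon numerically: among uniformly random words whose empirical mean of f is close to v, letter frequencies cluster around the maximum-entropy distribution, which has the Gibbs form \(p_j \propto e^{-\beta f(a_j)}\). The three-letter alphabet, the values f = (0, 1, 2), and the target v = 0.7 are hypothetical choices made purely for illustration.

```python
import numpy as np
from scipy.optimize import brentq

# Hypothetical setup: a 3-letter alphabet with f-values (0, 1, 2)
# and target mean v = 0.7 (not from the paper; chosen for illustration).
f_vals = np.array([0.0, 1.0, 2.0])
v = 0.7

# The entropy maximizer subject to E[f] = v has the Gibbs form
# p_j = exp(-beta * f_j) / Z(beta); find beta by matching the mean.
def mean_f(beta):
    w = np.exp(-beta * f_vals)
    return w.dot(f_vals) / w.sum()

beta = brentq(lambda b: mean_f(b) - v, -50.0, 50.0)
p = np.exp(-beta * f_vals)
p /= p.sum()

# Concentration check: among uniformly random words of length N whose
# empirical mean of f falls within tol of v, per-word letter frequencies
# should cluster around p.
rng = np.random.default_rng(0)
N, trials, tol = 100, 50_000, 0.05
words = rng.integers(0, len(f_vals), size=(trials, N))
close = words[np.abs(f_vals[words].mean(axis=1) - v) < tol]
freqs = np.stack([(close == j).mean(axis=1) for j in range(len(f_vals))], axis=1)

print("max-entropy p:       ", np.round(p, 3))
print("mean empirical freqs:", np.round(freqs.mean(axis=0), 3))
print("words selected:      ", len(close))
```

With these hypothetical numbers the maximizer works out to p close to (0.495, 0.310, 0.195), and the average empirical frequencies of the selected words should match it to within sampling noise. The Bose and Fermi variants proved in the paper replace this ordered word model with unordered data, where configurations are counted differently and hence a different entropy is maximized.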


REFERENCES

  1. Jaynes, E.T., Papers on Probability, Statistics, and Statistical Physics, Dordrecht: Kluwer, 1989.

  2. Cover, T.M. and Thomas, J.A., Elements of Information Theory, New York: Wiley, 1991.

  3. Li, M. and Vitanyi, P., An Introduction to Kolmogorov Complexity and Its Applications, New York: Springer, 1997, 2nd ed.

  4. Stratonovich, R.L., Teoriya informatsii (Information Theory), Moscow: Sov. Radio, 1975.

  5. Landau, L.D. and Lifshitz, E.M., Statisticheskaya fizika, Part 1, Moscow: Nauka, 1976. Translated under the title Statistical Physics, vol. 1, Oxford, New York: Pergamon, 1980.

  6. Jaynes, E.T., How Should We Use Entropy in Economics? (Some Half-Baked Ideas in Need of Criticism), unpublished manuscript. Available from http://bayes.wustl.edu/et/etj/articles/entropy.in.economics.pdf.

  7. Maslov, V.P., Integral Equations and Phase Transitions in Probability Games. Analogy with Statistical Physics, Teor. Veroyatn. Primen., 2003, vol. 48, no. 2, pp. 403–410 [Theory Probab. Appl. (Engl. Transl.), 2003, vol. 48, no. 2, pp. 359–367].

  8. Kolmogorov, A.N., Three Approaches to the Quantitative Definition of Information, Probl. Peredachi Inf., 1965, vol. 1, no. 1, pp. 3–11 [Probl. Inf. Trans. (Engl. Transl.), 1965, vol. 1, no. 1, pp. 1–7].

  9. Kolmogorov, A.N., The Logical Basis for Information Theory and Probability Theory, IEEE Trans. Inform. Theory, 1968, vol. 14, no. 3, pp. 662–664.

  10. Zurek, W.H., Algorithmic Randomness and Physical Entropy, Phys. Rev. A, 1989, vol. 40, no. 8, pp. 4731–4751.

  11. Rissanen, J., Minimum Description Length Principle, Encyclopaedia of Statistical Sciences, vol. 5, Kotz, S. and Johnson, N.L., Eds., New York: Wiley, 1986, pp. 523–527.

  12. Gacs, P., Tromp, J., and Vitanyi, P., Algorithmic Statistics, IEEE Trans. Inform. Theory, 2001, vol. 47, no. 6, pp. 2443–2463.

  13. V’yugin, V.V. and Maslov, V.P., Extremal Relations between Additive Loss Functions and the Kolmogorov Complexity, Probl. Peredachi Inf., 2003, vol. 39, no. 4, pp. 71–87 [Probl. Inf. Trans. (Engl. Transl.), 2003, vol. 39, no. 4, pp. 380–394].

  14. Bogolyubov, N.N., Energy Levels of the Non-Ideal Bose-Einstein Gas, Vestnik Moskov. Univ., 1947, vol. 7, pp. 43–56.

  15. Uspensky, V.A., Semenov, A.L., and Shen’, A.Kh., Can an Individual Sequence of Zeros and Ones Be Random?, Uspekhi Mat. Nauk, 1990, vol. 45, no. 1, pp. 105–162 [Russian Math. Surveys (Engl. Transl.), 1990, vol. 45, no. 1, pp. 121–189].

  16. Kolmogorov, A.N. and Uspensky, V.A., Algorithms and Randomness, Teor. Veroyatn. Primen., 1987, vol. 32, no. 3, pp. 425–455 [Theory Probab. Appl. (Engl. Transl.), 1987, vol. 32, no. 3, pp. 389–412].

  17. V’yugin, V.V., Algorithmic Complexity and Stochastic Properties of Finite Binary Sequences, Computer J., 1999, vol. 42, no. 4, pp. 294–317.

  18. Kolmogorov, A.N., Combinatorial Foundations of Information Theory and the Calculus of Probabilities, Uspekhi Mat. Nauk, 1983, vol. 38, no. 4, pp. 27–36 [Russian Math. Surveys (Engl. Transl.), 1983, vol. 38, no. 4, pp. 29–40].


Additional information

Translated from Problemy Peredachi Informatsii, No. 2, 2005, pp. 72–88.

Original Russian Text Copyright © 2005 by V’yugin, Maslov.

Supported in part by grants of the President of the Russian Federation for Leading Scientific Schools (nos. 358.2003.1 and 1678.2003.1), by the Russian Foundation for Basic Research (project no. 03-01-00475), and by the joint Russian-French RFBR-CNRS grant no. 02-02-22001.

Cite this article

V’yugin, V.V., Maslov, V.P. Concentration Theorems for Entropy and Free Energy. Probl Inf Transm 41, 134–149 (2005). https://doi.org/10.1007/s11122-005-0019-1
