
On computation of information via variation and inequalities for the entropy function

Abstract

This paper supplements the author’s paper [1]. We obtain an explicit formula which, in a special case, allows us to calculate the maximum of the mutual information of several random variables in terms of the variational distance between the joint distribution of these random variables and the product of their marginal distributions. We also establish two new inequalities for the binary entropy function related to the problem considered here.
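
The paper’s explicit formula is not reproduced in this preview. As a point of reference only, the sketch below computes the standard quantities named in the abstract for a toy pair of binary random variables: the mutual information I(X;Y), the variational distance between the joint distribution and the product of its marginals (taken here as the unnormalized l1 distance), and the binary entropy function h(p). The distribution pxy, the function names, and the use of base-2 logarithms are illustrative assumptions, not the paper’s notation.

import numpy as np

def binary_entropy(p):
    # Binary entropy h(p) = -p*log2(p) - (1-p)*log2(1-p), with h(0) = h(1) = 0.
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return -p * np.log2(p) - (1.0 - p) * np.log2(1.0 - p)

def mutual_information(pxy):
    # Mutual information I(X;Y) in bits for a joint distribution given as a 2-D array.
    px = pxy.sum(axis=1)        # marginal distribution of X
    py = pxy.sum(axis=0)        # marginal distribution of Y
    prod = np.outer(px, py)     # product of the marginal distributions
    mask = pxy > 0              # skip zero-probability cells, which contribute nothing
    return float(np.sum(pxy[mask] * np.log2(pxy[mask] / prod[mask])))

def variational_distance(pxy):
    # l1 (variational) distance between the joint distribution and the product of its marginals.
    px = pxy.sum(axis=1)
    py = pxy.sum(axis=0)
    return float(np.abs(pxy - np.outer(px, py)).sum())

# Toy joint distribution of a correlated binary pair (X, Y); rows index X, columns index Y.
pxy = np.array([[0.4, 0.1],
                [0.1, 0.4]])

print("I(X;Y)               =", mutual_information(pxy))    # ~0.278 bits
print("variational distance =", variational_distance(pxy))  # 0.6
print("h(0.2)               =", binary_entropy(0.2))        # ~0.722

The mask over zero-probability cells avoids evaluating log 0. The references below relate exactly these two quantities, mutual information and variation, which is the problem the present paper addresses for several random variables.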

References

  1. Prelov, V.V., Mutual Information of Several Random Variables and Its Estimation via Variation, Probl. Peredachi Inf., 2009, vol. 45, no. 4, pp. 3–17 [Probl. Inf. Trans. (Engl. Transl.), 2009, vol. 45, no. 4, pp. 295–308].

  2. Pinsker, M.S., On Estimation of Information via Variation, Probl. Peredachi Inf., 2005, vol. 41, no. 2, pp. 3–8 [Probl. Inf. Trans. (Engl. Transl.), 2005, vol. 41, no. 2, pp. 71–75].

  3. Prelov, V.V., On Inequalities between Mutual Information and Variation, Probl. Peredachi Inf., 2007, vol. 43, no. 1, pp. 15–27 [Probl. Inf. Trans. (Engl. Transl.), 2007, vol. 43, no. 1, pp. 12–23].

  4. Prelov, V.V. and van der Meulen, E.C., Mutual Information, Variation, and Fano’s Inequality, Probl. Peredachi Inf., 2008, vol. 44, no. 3, pp. 19–32 [Probl. Inf. Trans. (Engl. Transl.), 2008, vol. 44, no. 3, pp. 185–197].

Author information

Corresponding author

Correspondence to V. V. Prelov.

Additional information

Original Russian Text © V.V. Prelov, 2010, published in Problemy Peredachi Informatsii, 2010, Vol. 46, No. 2, pp. 24–29.

Supported in part by the Russian Foundation for Basic Research, project no. 09-01-00536.

About this article

Cite this article

Prelov, V.V. On computation of information via variation and inequalities for the entropy function. Probl Inf Transm 46, 122–126 (2010). https://doi.org/10.1134/S003294601002002X
