
Mutual information of several random variables and its estimation via variation

Problems of Information Transmission

Abstract

We obtain upper and lower bounds on the maximum of the mutual information of several random variables in terms of the variational distance between the joint distribution of these random variables and the product of its marginal distributions. In this connection, we derive some properties of the variational distance between probability distributions of this type. We show that in some special cases the estimates of the maximum of mutual information obtained here are optimal or asymptotically optimal. Some results of this paper generalize the corresponding results of [1–3] to the multivariate case.
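
As a concrete illustration of the two quantities the abstract relates (this sketch is not taken from the paper), the following code computes, for discrete random variables with a given finite joint distribution, the mutual information I(X_1; ...; X_n), understood here as the divergence between the joint distribution and the product of its marginal distributions, together with the variational distance V between these same two distributions. The function names and the three-variable example are illustrative; the printed values are consistent with Pinsker's classical inequality I >= V^2/2 (in nats), the prototype of the bounds studied in [1–3] and refined in [4].

    import numpy as np

    def product_of_marginals(p):
        # Product distribution P_1 x ... x P_n, broadcast to the shape of the joint p.
        q = np.ones_like(p)
        for axis in range(p.ndim):
            marginal = p.sum(axis=tuple(a for a in range(p.ndim) if a != axis))
            shape = [1] * p.ndim
            shape[axis] = -1
            q = q * marginal.reshape(shape)
        return q

    def multi_information(p):
        # I(X_1; ...; X_n) = D(P || P_1 x ... x P_n), in nats.
        q = product_of_marginals(p)
        mask = p > 0
        return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

    def variation(p):
        # V = sum over all cells of |P - P_1 x ... x P_n|.
        return float(np.abs(p - product_of_marginals(p)).sum())

    # Three perfectly correlated binary variables:
    # I = 2 ln 2 ~ 1.386 nats, V = 3/2, so Pinsker's bound I >= V^2/2 = 1.125 holds.
    p = np.zeros((2, 2, 2))
    p[0, 0, 0] = p[1, 1, 1] = 0.5
    print(multi_information(p), variation(p))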


References

  1. Pinsker, M.S., On Estimation of Information via Variation, Probl. Peredachi Inf., 2005, vol. 41, no. 2, pp. 3–8 [Probl. Inf. Trans. (Engl. Transl.), 2005, vol. 41, no. 2, pp. 71–75].

  2. Prelov, V.V., On Inequalities between Mutual Information and Variation, Probl. Peredachi Inf., 2007, vol. 43, no. 1, pp. 15–27 [Probl. Inf. Trans. (Engl. Transl.), 2007, vol. 43, no. 1, pp. 12–23].

  3. Prelov, V.V. and van der Meulen, E.C., Mutual Information, Variation, and Fano’s Inequality, Probl. Peredachi Inf., 2008, vol. 44, no. 3, pp. 19–32 [Probl. Inf. Trans. (Engl. Transl.), 2008, vol. 44, no. 3, pp. 185–197].

  4. Fedotov, A.A., Harremoës, P., and Topsøe, F., Refinements of Pinsker’s Inequality, IEEE Trans. Inform. Theory, 2003, vol. 49, no. 6, pp. 1491–1498.

  5. Pinsker, M.S., Informatsiya i informatsionnaya ustoichivost’ sluchainykh velichin i protsessov, Probl. Peredachi Inf., issue 7, Moscow: Akad. Nauk SSSR, 1960. Translated under the title Information and Information Stability of Random Variables and Processes, San Francisco: Holden-Day, 1964.

  6. Csiszár, I. and Körner, J., Information Theory: Coding Theorems for Discrete Memoryless Systems, New York: Academic; Budapest: Akad. Kiadó, 1981. Translated under the title Teoriya informatsii: teoremy kodirovaniya dlya diskretnykh sistem bez pamyati, Moscow: Mir, 1985.

  7. Zhang, Z., Estimating Mutual Information via Kolmogorov Distance, IEEE Trans. Inform. Theory, 2007, vol. 53, no. 9, pp. 3280–3282.


Author information


Corresponding author

Correspondence to V. V. Prelov.

Additional information

Original Russian Text © V.V. Prelov, 2009, published in Problemy Peredachi Informatsii, 2009, Vol. 45, No. 4, pp. 3–17.

Supported in part by the Russian Foundation for Basic Research, project no. 09-01-00536.


Cite this article

Prelov, V.V. Mutual information of several random variables and its estimation via variation. Probl Inf Transm 45, 295–308 (2009). https://doi.org/10.1134/S0032946009040012

