Abstract
We obtain upper and lower bounds for the maximum of the mutual information of several random variables via the variational distance between the joint distribution of these random variables and the product of its marginal distributions. In this connection, we derive some properties of the variational distance between probability distributions of this type. We show that in some special cases the estimates of the maximum of mutual information obtained here are optimal or asymptotically optimal. Some results of this paper generalize the corresponding results of [1–3] to the multivariate case.
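The relationship between mutual information and variational distance that the abstract refers to can be illustrated numerically. The following sketch, using a hypothetical 2×2 joint distribution chosen purely for illustration, computes both quantities and checks the classical Pinsker-type lower bound I ≥ V²/2 (in nats), of which the bounds studied in this paper and in [1–4] are refinements and generalizations.

```python
import math

# Hypothetical 2x2 joint pmf of (X, Y); rows index X, columns index Y.
joint = [[0.4, 0.1],
         [0.1, 0.4]]

# Marginal distributions of X and Y.
px = [sum(row) for row in joint]
py = [sum(joint[i][j] for i in range(2)) for j in range(2)]

# Mutual information I(X;Y) in nats: the KL divergence between the
# joint distribution and the product of its marginals.
I = sum(joint[i][j] * math.log(joint[i][j] / (px[i] * py[j]))
        for i in range(2) for j in range(2) if joint[i][j] > 0)

# Variational (L1) distance between the joint distribution and the
# product of its marginals.
V = sum(abs(joint[i][j] - px[i] * py[j])
        for i in range(2) for j in range(2))

# Pinsker's inequality for this pair of distributions: I >= V^2 / 2.
assert I >= V ** 2 / 2
print(f"I = {I:.4f} nats, V = {V:.4f}, V^2/2 = {V ** 2 / 2:.4f}")
```

For this example the marginals are uniform, V = 0.6, and I ≈ 0.193 nats, so the Pinsker bound V²/2 = 0.18 is nearly tight, which hints at why such variation-based estimates can be optimal or asymptotically optimal in special cases.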
References
Pinsker, M.S., On Estimation of Information via Variation, Probl. Peredachi Inf., 2005, vol. 41, no. 2, pp. 3–8 [Probl. Inf. Trans. (Engl. Transl.), 2005, vol. 41, no. 2, pp. 71–75].
Prelov, V.V., On Inequalities between Mutual Information and Variation, Probl. Peredachi Inf., 2007, vol. 43, no. 1, pp. 15–27 [Probl. Inf. Trans. (Engl. Transl.), 2007, vol. 43, no. 1, pp. 12–23].
Prelov, V.V. and van der Meulen, E.C., Mutual Information, Variation, and Fano’s Inequality, Probl. Peredachi Inf., 2008, vol. 44, no. 3, pp. 19–32 [Probl. Inf. Trans. (Engl. Transl.), 2008, vol. 44, no. 3, pp. 185–197].
Fedotov, A.A., Harremöes, P., and Topsøe, F., Refinements of Pinsker’s Inequality, IEEE Trans. Inform. Theory, 2003, vol. 49, no. 6, pp. 1491–1498.
Pinsker, M.S., Informatsiya i informatsionnaya ustoichivost’ sluchainykh velichin i protsessov, Probl. Peredachi Inf., issue 7, Moscow: Akad. Nauk SSSR, 1960. Translated under the title Information and Information Stability of Random Variables and Processes, San Francisco: Holden-Day, 1964.
Csiszár, I. and Körner, J., Information Theory: Coding Theorems for Discrete Memoryless Systems, New York: Academic; Budapest: Akad. Kiadó, 1981. Translated under the title Teoriya informatsii: teoremy kodirovaniya dlya diskretnykh sistem bez pamyati, Moscow: Mir, 1985.
Zhang, Z., Estimating Mutual Information via Kolmogorov Distance, IEEE Trans. Inform. Theory, 2007, vol. 53, no. 9, pp. 3280–3282.
Original Russian Text © V.V. Prelov, 2009, published in Problemy Peredachi Informatsii, 2009, Vol. 45, No. 4, pp. 3–17.
Supported in part by the Russian Foundation for Basic Research, project no. 09-01-00536.
Prelov, V.V. Mutual information of several random variables and its estimation via variation. Probl Inf Transm 45, 295–308 (2009). https://doi.org/10.1134/S0032946009040012