Abstract
This paper supplements the author's earlier paper [1]. We obtain an explicit formula that, in a special case, allows one to calculate the maximum of the mutual information of several random variables via the variational distance between the joint distribution of these random variables and the product of their marginal distributions. We also establish two new inequalities for the binary entropy function related to the problem considered here.
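For reference, the quantities appearing in the abstract can be stated explicitly. The following is a minimal sketch using the standard definitions (assumed here to match the conventions of [1]); it does not reproduce the paper's explicit formula. For discrete random variables $X_1,\dots,X_n$ with joint distribution $P$ and marginals $P_1,\dots,P_n$, the mutual information and the variational distance are

$$I(X_1;\dots;X_n) \;=\; D\bigl(P \,\|\, P_1 \times \dots \times P_n\bigr) \;=\; \sum_{x} P(x)\,\log\frac{P(x)}{(P_1 \times \dots \times P_n)(x)},$$

$$V\bigl(P,\, P_1 \times \dots \times P_n\bigr) \;=\; \sum_{x} \bigl|\, P(x) - (P_1 \times \dots \times P_n)(x) \,\bigr|,$$

and the binary entropy function is $h(p) = -p\log p - (1-p)\log(1-p)$, $0 \le p \le 1$.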
References
Prelov, V.V., Mutual Information of Several Random Variables and Its Estimation via Variation, Probl. Peredachi Inf., 2009, vol. 45, no. 4, pp. 3–17 [Probl. Inf. Trans. (Engl. Transl.), 2009, vol. 45, no. 4, pp. 295–308].
Pinsker, M.S., On Estimation of Information via Variation, Probl. Peredachi Inf., 2005, vol. 41, no. 2, pp. 3–8 [Probl. Inf. Trans. (Engl. Transl.), 2005, vol. 41, no. 2, pp. 71–75].
Prelov, V.V., On Inequalities between Mutual Information and Variation, Probl. Peredachi Inf., 2007, vol. 43, no. 1, pp. 15–27 [Probl. Inf. Trans. (Engl. Transl.), 2007, vol. 43, no. 1, pp. 12–23].
Prelov, V.V. and van der Meulen, E.C., Mutual Information, Variation, and Fano’s Inequality, Probl. Peredachi Inf., 2008, vol. 44, no. 3, pp. 19–32 [Probl. Inf. Trans. (Engl. Transl.), 2008, vol. 44, no. 3, pp. 185–197].
Additional information
Original Russian Text © V.V. Prelov, 2010, published in Problemy Peredachi Informatsii, 2010, Vol. 46, No. 2, pp. 24–29.
Supported in part by the Russian Foundation for Basic Research, project no. 09-01-00536.
Cite this article
Prelov, V.V. On computation of information via variation and inequalities for the entropy function. Probl Inf Transm 46, 122–126 (2010). https://doi.org/10.1134/S003294601002002X