Abstract
We consider the problem of finding the maximum values of f-divergences Df(P ∥ Q) of discrete probability distributions P and Q taking values on a finite set, under the condition that the variation distance V(P, Q) between them and one of the distributions P or Q are given. We obtain exact expressions for such maxima of f-divergences, which in a number of cases yield both explicit formulas and upper bounds for them. As a consequence, we obtain explicit expressions for the maxima of f-divergences Df(P ∥ Q) given that, besides V(P, Q), we only know the value of the maximum component of either P or Q. Analogous results are also obtained for the Rényi divergence.
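The quantities the abstract optimizes over can be made concrete with a short sketch (not part of the paper): an f-divergence of discrete distributions is Df(P ∥ Q) = Σᵢ qᵢ f(pᵢ/qᵢ) for a convex f with f(1) = 0, and the variation distance here is taken as the ℓ₁ distance V(P, Q) = Σᵢ |pᵢ − qᵢ| (note that some authors use half this quantity). The function names below are illustrative, not from the paper.

```python
import numpy as np

def f_divergence(p, q, f):
    # D_f(P || Q) = sum_i q_i * f(p_i / q_i); assumes q_i > 0 for all i
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    return float(np.sum(q * f(p / q)))

def variation_distance(p, q):
    # V(P, Q) = sum_i |p_i - q_i|  (the l1 convention; some texts use half of it)
    return float(np.sum(np.abs(np.asarray(p, float) - np.asarray(q, float))))

# Kullback-Leibler divergence is the f-divergence with f(t) = t * log(t)
kl = lambda t: t * np.log(t)

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(f_divergence(p, q, kl))    # KL divergence D(P || Q)
print(variation_distance(p, q))  # V(P, Q) = 0.2
```

Other choices of f recover familiar measures, e.g. f(t) = (t − 1)² gives the chi-squared divergence; the paper's problem is to maximize Df over all pairs (P, Q) compatible with a fixed V(P, Q) and a fixed P (or Q).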
Funding
The research was supported in part by the Russian Foundation for Basic Research, project no. 19-01-00364.
Additional information
Russian Text © The Author(s), 2020, published in Problemy Peredachi Informatsii, 2020, Vol. 56, No. 1, pp. 3–14.
Cite this article
Prelov, V.V. On the Maximum Values of f-Divergence and Rényi Divergence under a Given Variational Distance. Probl Inf Transm 56, 1–12 (2020). https://doi.org/10.1134/S0032946020010019