
On the Maximum Values of f-Divergence and Rényi Divergence under a Given Variational Distance

Information Theory · Published in Problems of Information Transmission

Abstract

We consider the problem of finding the maximum values of f-divergences Df(P ∥ Q) of discrete probability distributions P and Q taking values on a finite set, under the condition that the variational distance V(P, Q) between them and one of the distributions, P or Q, are given. We obtain exact expressions for such maxima of f-divergences, which in a number of cases allow us to derive both explicit formulas and upper bounds for them. As a consequence, we obtain explicit expressions for the maxima of f-divergences Df(P ∥ Q) given that, besides V(P, Q), we only know the value of the maximum component of either P or Q. Analogous results are also obtained for the Rényi divergence.
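For readers unfamiliar with the quantities involved, the following sketch computes them numerically using their standard textbook definitions (Df(P ∥ Q) = Σᵢ qᵢ f(pᵢ/qᵢ), V(P, Q) = Σᵢ |pᵢ − qᵢ|, and the Rényi divergence of order α ≠ 1); these are common conventions, not formulas taken from the paper itself, and the distributions used are arbitrary illustrative values:

```python
import math

def f_divergence(p, q, f):
    # D_f(P || Q) = sum_i q_i * f(p_i / q_i); assumes q_i > 0 everywhere.
    return sum(qi * f(pi / qi) for pi, qi in zip(p, q))

def variational_distance(p, q):
    # V(P, Q) = sum_i |p_i - q_i|  (the L1 convention; some authors halve this).
    return sum(abs(pi - qi) for pi, qi in zip(p, q))

def renyi_divergence(p, q, alpha):
    # D_alpha(P || Q) = log( sum_i p_i^alpha * q_i^(1 - alpha) ) / (alpha - 1),
    # defined for alpha > 0, alpha != 1; assumes q_i > 0 everywhere.
    s = sum(pi ** alpha * qi ** (1 - alpha) for pi, qi in zip(p, q))
    return math.log(s) / (alpha - 1)

# The Kullback-Leibler divergence is the f-divergence with f(t) = t*log(t), f(0) = 0.
kl_f = lambda t: t * math.log(t) if t > 0 else 0.0

p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(f_divergence(p, q, kl_f))    # D(P || Q)
print(variational_distance(p, q))  # V(P, Q)
print(renyi_divergence(p, q, 2.0)) # Renyi divergence of order 2
```

The paper asks how large Df(P ∥ Q) can be when V(P, Q) and one of the two distributions are held fixed; the functions above let one explore that constraint numerically for any convex f.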



Funding

The research was supported in part by the Russian Foundation for Basic Research, project no. 19-01-00364.

Author information

Corresponding author

Correspondence to V. V. Prelov.

Additional information

Russian Text © The Author(s), 2020, published in Problemy Peredachi Informatsii, 2020, Vol. 56, No. 1, pp. 3–14.


About this article


Cite this article

Prelov, V.V. On the Maximum Values of f-Divergence and Rényi Divergence under a Given Variational Distance. Probl Inf Transm 56, 1–12 (2020). https://doi.org/10.1134/S0032946020010019

