
Estimation of KL Divergence: Optimal Minimax Rate



Abstract:

The problem of estimating the Kullback-Leibler divergence D(P∥Q) between two unknown distributions P and Q is studied, under the assumption that the alphabet size k of the distributions can scale to infinity. The estimation is based on m independent samples drawn from P and n independent samples drawn from Q. It is first shown that there does not exist any consistent estimator that guarantees asymptotically small worst-case quadratic risk over the set of all pairs of distributions. A restricted set that contains pairs of distributions with density ratio bounded by a function f(k) is further considered. An augmented plug-in estimator is proposed, and its worst-case quadratic risk is shown to be within a constant factor of ((k/m) + (kf(k)/n))² + (log²f(k))/m + f(k)/n, if m and n exceed a constant factor of k and kf(k), respectively. Moreover, the minimax quadratic risk is characterized to be within a constant factor of ((k/(m log k)) + (kf(k)/(n log k)))² + (log²f(k))/m + f(k)/n, if m and n exceed a constant factor of k/log k and kf(k)/log k, respectively. The lower bound on the minimax quadratic risk is established by employing a generalized Le Cam's method. A minimax optimal estimator is then constructed by combining the polynomial approximation and plug-in approaches.
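The augmented plug-in estimator referenced above replaces P and Q with empirical distributions, with the Q-side counts augmented to keep the density ratio finite. A minimal sketch in Python, assuming an add-constant smoothing of the Q-side empirical distribution (the function name and the exact augmentation constant are illustrative; the paper's precise construction may differ):

```python
import numpy as np

def augmented_plugin_kl(samples_p, samples_q, k, c=1.0):
    """Plug-in estimate of D(P||Q) over the alphabet {0, ..., k-1}.

    Every Q-side count is augmented by a constant c before normalizing,
    so q_hat is strictly positive and the ratio p_hat/q_hat stays finite.
    """
    m, n = len(samples_p), len(samples_q)
    p_hat = np.bincount(samples_p, minlength=k) / m
    q_hat = (np.bincount(samples_q, minlength=k) + c) / (n + c * k)
    mask = p_hat > 0  # convention: 0 * log(0 / q) = 0
    return float(np.sum(p_hat[mask] * np.log(p_hat[mask] / q_hat[mask])))

# Illustrative usage on a known pair (P, Q) with bounded density ratio.
rng = np.random.default_rng(0)
k = 5
p = np.array([0.4, 0.3, 0.15, 0.1, 0.05])
q = np.ones(k) / k
sp = rng.choice(k, size=10000, p=p)
sq = rng.choice(k, size=10000, p=q)
est = augmented_plugin_kl(sp, sq, k)
true_kl = float(np.sum(p * np.log(p / q)))
print(est, true_kl)
```

With m and n large relative to k, the estimate concentrates near the true divergence; the abstract's risk bound quantifies exactly how the bias and variance of such plug-in schemes scale with k, m, n, and the ratio bound f(k).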
Published in: IEEE Transactions on Information Theory ( Volume: 64, Issue: 4, April 2018)
Page(s): 2648 - 2674
Date of Publication: 13 February 2018

