Abstract:
The Kullback-Leibler (KL) divergence is a discrepancy measure between probability distributions that plays a central role in information theory, statistics, and machine learning. While there are numerous methods for estimating this quantity from data, a limit distribution theory, which quantifies fluctuations of the estimation error, is largely obscure. In this paper, we close this gap by identifying sufficient conditions on the population distributions for the existence of distributional limits and by characterizing the limiting variables. These results are used to derive one- and two-sample limit theorems for the Gaussian-smoothed KL divergence, both under the null and under the alternative. Finally, an application of the limit distribution results to auditing differential privacy is proposed and analyzed for significance level and power against local alternatives.
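For reference, a minimal sketch of the two quantities named in the abstract, using the standard definitions (the smoothing parameter sigma and the isotropic Gaussian kernel are notational assumptions, not taken from the abstract itself):

% KL divergence between P and Q (assuming P is absolutely continuous w.r.t. Q)
\[
  \mathsf{D}(P \| Q) = \int \log\!\frac{\mathrm{d}P}{\mathrm{d}Q}\,\mathrm{d}P .
\]
% Gaussian-smoothed KL divergence: both distributions are convolved with an
% isotropic Gaussian kernel before the divergence is evaluated.
\[
  \mathsf{D}^{(\sigma)}(P \| Q) = \mathsf{D}\bigl(P \ast \mathcal{N}_{\sigma} \,\big\|\, Q \ast \mathcal{N}_{\sigma}\bigr),
  \qquad \mathcal{N}_{\sigma} = \mathcal{N}(0, \sigma^{2}\mathrm{I}_{d}).
\]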
Date of Conference: 25-30 June 2023
Date Added to IEEE Xplore: 22 August 2023