Abstract:
We study universal compression of n i.i.d. copies of a k-variate Gaussian random vector, when the mean is an unknown vector in a Euclidean ball of ℝ^k and the covariance is known. We adopt the high-dimensional scaling k = Θ(n) to bring out a compression perspective on the inadmissibility of unbiased estimates of the mean of a k-variate Gaussian (when k ≥ 3), focusing in particular on the optimal unbiased Maximum Likelihood estimate. Using arguments based on the redundancy-capacity theorem, we show that the redundancy of any universal compressor in this high-dimensional setting is lower bounded as Ω(n). We show that natural compression schemes based on the Maximum Likelihood estimate of the mean have suboptimal Θ(n log n) redundancy, whereas a scheme based on the biased James-Stein estimate of the mean incurs redundancy of Θ(n), matching the lower bound.
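As a rough illustration of the two estimates the abstract compares, the following Python sketch contrasts the Maximum Likelihood estimate (the sample mean) with the positive-part James-Stein shrinkage estimate for n i.i.d. samples of a k-variate Gaussian with known identity covariance. The ball radius, dimensions, and variable names are assumptions made for illustration only; the paper's actual compression schemes, which code the data using such plug-in estimates, are not reproduced here.

import numpy as np

rng = np.random.default_rng(0)

k, n = 100, 100                          # high-dimensional regime k = Theta(n)
theta = rng.normal(size=k)
theta *= 5.0 / np.linalg.norm(theta)     # unknown mean in a Euclidean ball (radius 5 assumed)

# n i.i.d. samples from N(theta, I_k); the covariance (identity) is known
X = theta + rng.normal(size=(n, k))

# ML (unbiased) estimate: the sample mean, distributed as N(theta, I_k / n)
theta_ml = X.mean(axis=0)

# James-Stein estimate: shrink the sample mean toward the origin.
# For Y ~ N(theta, (1/n) I_k), the shrinkage factor is 1 - (k - 2) / (n ||Y||^2);
# the positive-part variant clips it at zero.
shrink = max(0.0, 1.0 - (k - 2) / (n * np.sum(theta_ml ** 2)))
theta_js = shrink * theta_ml

print("squared error, ML:", np.sum((theta_ml - theta) ** 2))
print("squared error, JS:", np.sum((theta_js - theta) ** 2))

For k ≥ 3 the James-Stein estimate has strictly smaller expected squared error than the sample mean; this is the inadmissibility phenomenon the paper recasts in terms of compression redundancy.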
Date of Conference: 25-30 June 2023
Date Added to IEEE Xplore: 22 August 2023