Abstract:
Let X and Y be dependent random variables. This paper considers the problem of designing a scalar quantizer for Y to maximize the mutual information between the quantizer's output and X, and develops fundamental properties and bounds for this form of quantization, which is connected to the log-loss distortion criterion. The main focus is the regime of low I(X;Y), where it is shown that, if X is binary, a constant fraction of the mutual information can always be preserved using O(log(1/I(X;Y))) quantization levels, and there exist distributions for which this many quantization levels are necessary. Furthermore, for larger finite alphabets 2 < |X| < ∞, it is established that an η-fraction of the mutual information can be preserved using roughly (log(|X|/I(X;Y)))^{η·(|X|−1)} quantization levels.
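As a sketch of the underlying optimization (the notation below, including the symbol I_K, is an assumption introduced here for illustration and does not appear in the abstract), the quantity of interest for a budget of K quantization levels can be written in LaTeX as:

% Hypothetical formalization (notation assumed): the best K-level scalar
% quantizer of Y, judged by how much information about X its output retains.
\[
  I_K(X;Y) \;\triangleq\; \max_{q \,:\, \mathcal{Y} \to \{1,\dots,K\}} I\bigl(X;\, q(Y)\bigr).
\]
% In this notation, the low-I(X;Y) result for binary X says that the ratio
% I_K(X;Y)/I(X;Y) stays bounded away from zero once K = O(log(1/I(X;Y))),
% and that this order of K is necessary for some distributions.

Under this (assumed) notation, the stated results bound how fast I_K(X;Y) approaches I(X;Y) as the number of levels K grows, in the regime where I(X;Y) is small.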
Published in: IEEE Transactions on Information Theory (Volume 67, Issue 4, April 2021)