
Quantizing for maximum output entropy (Corresp.)



Abstract:

The entropy at the output of a quantizer is equal to the average mutual information between the unquantized and quantized random variables. Thus, for a fixed number of quantization levels, output entropy is a reasonable information-theoretic criterion of quantizer fidelity. It is shown that, for a class of signal distributions which includes the Gaussian, the quantizers with maximum output entropy (MOE) and minimum average error (MAE) are approximately the same within a multiplicative constant.
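
To make the entropy criterion concrete, the following is a minimal numerical sketch (not taken from the correspondence itself; the function names and parameter values are illustrative). It assumes a zero-mean, unit-variance Gaussian source and uses the standard fact that, for a noiseless quantizer, the output entropy H(Q(X)) is at most log2 N bits and reaches that bound exactly when the N quantization cells are equiprobable, i.e., for the MOE quantizer.

```python
# Sketch: output entropy of an N-level quantizer for a Gaussian source.
# The MOE quantizer places thresholds at Gaussian quantiles so that all
# cells are equiprobable; a uniform quantizer is shown for comparison.
import numpy as np
from scipy.stats import norm

def output_entropy(thresholds, dist=norm):
    """Entropy (bits) of the quantizer output when the input follows `dist`."""
    edges = np.concatenate(([-np.inf], np.asarray(thresholds, float), [np.inf]))
    probs = np.diff(dist.cdf(edges))      # probability mass of each cell
    probs = probs[probs > 0]
    return -np.sum(probs * np.log2(probs))

def moe_thresholds(n_levels, dist=norm):
    """Maximum-output-entropy (equiprobable-cell) thresholds for `dist`."""
    return dist.ppf(np.arange(1, n_levels) / n_levels)

def uniform_thresholds(n_levels, step):
    """Thresholds of a uniform quantizer with the given step size, centred at 0."""
    return step * (np.arange(1, n_levels) - n_levels / 2)

N = 8
print("MOE quantizer:     H =", output_entropy(moe_thresholds(N)), "bits (log2 N =", np.log2(N), ")")
print("Uniform quantizer: H =", output_entropy(uniform_thresholds(N, 0.5)), "bits")
```

The uniform quantizer here is only a point of comparison showing an output entropy below log2 N; the correspondence's actual claim, that the MOE and minimum-average-error quantizers nearly coincide for Gaussian-like sources, is not demonstrated by this sketch.
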
Published in: IEEE Transactions on Information Theory (Volume: 17, Issue: 5, September 1971)
Page(s): 612
Date of Publication: 30 September 1971
