Log Ratio of Entropy Powers


Abstract:

In his landmark 1948 paper [1], Shannon defined what he called the derived quantity of entropy power, also called entropy rate power, to be the power in a Gaussian white noise limited to the same band as the original ensemble and having the same entropy. He then used the entropy power in bounding the capacity of certain channels and for specifying a lower bound on the rate distortion function of a source. Kolmogorov [2] and Pinsker [3] derived an expression for the entropy rate power of a discrete-time stationary random process in terms of its power spectral density. The entropy power inequality also plays an important role in multiterminal information theory. These have been the major applications of entropy power in the last (almost) 70 years. We reconsider entropy rate power and use the log ratio of entropy powers to analyze the performance of tandem communications and signal processing systems in terms of the change in mutual information with each stage. We also examine ways to calculate or approximate the entropy rate power when performing the analyses.
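
For reference, the quantities named in the abstract have compact standard forms. The sketch below states them in nats, with h(X) the differential entropy of X and S(ω) the power spectral density of a discrete-time stationary Gaussian process; the third line, which follows immediately from the first, is the log ratio of entropy powers that the paper uses to track the change in differential entropy, and hence in mutual information, across stages.

```latex
% Entropy power, entropy rate power, and their log ratio (all in nats).
% h(X): differential entropy; S(\omega): power spectral density.
\begin{align}
  Q(X)     &= \frac{1}{2\pi e}\, e^{2 h(X)} \\
  Q_\infty &= \exp\left\{ \frac{1}{2\pi} \int_{-\pi}^{\pi} \ln S(\omega)\, d\omega \right\} \\
  \frac{1}{2} \ln \frac{Q(X)}{Q(Y)} &= h(X) - h(Y)
\end{align}
```

On the "ways to calculate or approximate the entropy rate power" point, a minimal numerical sketch (assuming NumPy; the AR(1) test spectrum is an illustration, not drawn from the paper): on a uniform frequency grid covering one full period, the normalized log-spectrum integral reduces to the mean of the log-PSD samples. For an AR(1) spectrum the entropy rate power equals the driving-noise variance, by the Szegő/Kolmogorov prediction-error result, which gives a built-in sanity check.

```python
import numpy as np

def entropy_rate_power(psd):
    """Approximate Q_inf = exp{(1/2pi) * integral of ln S(w) dw over [-pi, pi]}.

    `psd` holds samples of S(w) on a uniform grid covering one full period,
    so the normalized integral reduces to the mean of the log-PSD samples.
    """
    return float(np.exp(np.mean(np.log(psd))))

# Sanity check with an AR(1) spectrum S(w) = sigma2 / |1 - a e^{-jw}|^2:
# its entropy rate power equals the one-step prediction error variance sigma2.
w = np.linspace(-np.pi, np.pi, 8192, endpoint=False)
a, sigma2 = 0.9, 1.0
psd = sigma2 / np.abs(1.0 - a * np.exp(-1j * w)) ** 2
print(entropy_rate_power(psd))  # ~= 1.0
```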
Date of Conference: 11-16 February 2018
Date Added to IEEE Xplore: 25 October 2018
Conference Location: San Diego, CA, USA

