Abstract
In the information-theoretic world, entropy is both the measure of randomness in a source and a lower bound on the compression achievable for that source by any encoding scheme. But when we must restrict ourselves to efficient schemes, entropy no longer captures these notions well. For example, there are distributions with very low entropy that nonetheless look random to polynomially time-bounded algorithms.
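The classical fact referred to above — that entropy lower-bounds the expected length of any (prefix-free) encoding, and that an optimal code achieves it for dyadic distributions — can be illustrated with a small sketch. The function names below are our own illustrative choices, not notation from the paper:

```python
import heapq
import math

def shannon_entropy(probs):
    """H(X) = -sum_x p(x) log2 p(x): the lower bound on expected code length."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def huffman_expected_length(probs):
    """Expected codeword length of an optimal prefix (Huffman) code."""
    if len(probs) == 1:
        return 1.0  # a lone symbol still needs one bit
    # Heap of (probability, tiebreaker); the tiebreaker avoids comparing floats only.
    heap = [(p, i) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    counter = len(probs)
    total = 0.0
    while len(heap) > 1:
        p1, _ = heapq.heappop(heap)
        p2, _ = heapq.heappop(heap)
        total += p1 + p2  # each merge adds one bit to every leaf below it
        heapq.heappush(heap, (p1 + p2, counter))
        counter += 1
    return total

# A dyadic distribution: the Huffman code meets the entropy bound exactly.
probs = [0.5, 0.25, 0.125, 0.125]
H = shannon_entropy(probs)          # 1.75 bits
L = huffman_expected_length(probs)  # 1.75 bits, so L == H here; in general L >= H
```

For non-dyadic distributions the optimal expected length exceeds the entropy by less than one bit, which is the sense in which entropy characterises compressibility in the unbounded setting; the point of the paper is that this correspondence breaks down once time bounds are imposed.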
Different notions of computational entropy have been proposed to take the role of entropy in such settings. Results in [GS91] and [Wee04] suggest that when time bounds are introduced, the entropy of a distribution no longer coincides with the most effective compression for that source.
This paper analyses three measures that try to capture the compressibility of a source, establishing relations and separations between them and analysing two special cases: the uniform distribution and the time-bounded universal distribution m^t over binary strings of a fixed size. It is shown that for the uniform distribution the three measures are equivalent, and that for m^t there is a clear separation between metric-type entropy, and thus pseudo-entropy, and the maximum compressibility of a source.
References
Barak, B., Shaltiel, R., Wigderson, A.: Computational Analogues of Entropy. In: Arora, S., Jansen, K., Rolim, J.D.P., Sahai, A. (eds.) RANDOM 2003 and APPROX 2003. LNCS, vol. 2764, pp. 200–215. Springer, Heidelberg (2003), Available at http://www.math.ias.edu/~avi/PUBLICATIONS/MYPAPERS/BSW03/bsw03.ps
Chaitin, G.J.: On the length of programs for computing finite binary sequences. Journal of the ACM 13(4), 547–569 (1966)
Goldberg, A., Sipser, M.: Compression and Ranking. SIAM Journal On Computing 20(3), 524–536 (1991)
Grünwald, P., Vitányi, P.: Kolmogorov Complexity and Information Theory. Journal of Logic, Language and Information 12(4), 497–529 (2003), Available at http://citeseer.ist.psu.edu/565384.html
Håstad, J., Impagliazzo, R., Levin, L., Luby, M.: A Pseudorandom Generator from any One-way Function. SIAM Journal On Computing 28(4), 1364–1396 (1999), Available at http://citeseer.ist.psu.edu/hastad99pseudorandom.html
Kolmogorov, A.N.: Three approaches to the quantitative definition of information. Problems Inform. Transmission 1(1), 1–7 (1965)
Li, M., Vitányi, P.M.B.: An introduction to Kolmogorov complexity and its applications, 2nd edn. Springer, Heidelberg (1997)
Shannon, C.E.: A mathematical theory of communication. Bell System Technical Journal 27, 379–423 and 623–656 (1948)
Wee, H.: On Pseudoentropy versus Compressibility. In: IEEE Conference On Computational Complexity, pp. 29–41 (2004), Available at http://ieeexplore.ieee.org/iel5/9188/29139/01313782.pdf
Yao, A.: Computational Information Theory. In: Complexity in Information Theory, pp. 1–15. Springer, Heidelberg (1988)
Copyright information
© 2007 Springer-Verlag Berlin Heidelberg
Cite this paper
Pinto, A. (2007). Comparing Notions of Computational Entropy. In: Cooper, S.B., Löwe, B., Sorbi, A. (eds) Computation and Logic in the Real World. CiE 2007. Lecture Notes in Computer Science, vol 4497. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-73001-9_63
Print ISBN: 978-3-540-73000-2
Online ISBN: 978-3-540-73001-9