
Comparing Notions of Computational Entropy

  • Conference paper
Computation and Logic in the Real World (CiE 2007)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 4497)


Abstract

In the information-theoretic world, entropy is both the measure of randomness in a source and a lower bound on the compression achievable for that source by any encoding scheme. But when we restrict ourselves to efficient schemes, entropy no longer captures these notions well. For example, there are distributions with very low entropy that nonetheless look random to polynomial-time algorithms.
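As a concrete reminder of the classical picture the abstract appeals to (an illustration, not part of the paper): for a dyadic source, whose probabilities are all powers of 1/2, an optimal prefix code achieves expected codeword length exactly equal to the Shannon entropy. A minimal Python sketch:

```python
import math

def entropy(p):
    """Shannon entropy, in bits, of a probability vector."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# A dyadic source over four symbols.
p = [0.5, 0.25, 0.125, 0.125]

# Prefix-code lengths realizable by codewords 0, 10, 110, 111.
lengths = [1, 2, 3, 3]

# The Kraft inequality holds with equality, so these lengths are achievable.
assert sum(2.0 ** -l for l in lengths) == 1.0

avg_len = sum(pi * li for pi, li in zip(p, lengths))
print(entropy(p), avg_len)  # both 1.75: the entropy bound is met exactly
```

For non-dyadic sources the optimal expected length exceeds the entropy by strictly less than one bit; the point the abstract makes is that this tight correspondence fails once the encoder must be efficient.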

Different notions of computational entropy have been proposed to play the role of entropy in such settings. Results in [GS91] and [Wee04] suggest that once time bounds are introduced, the entropy of a distribution no longer coincides with the most effective compression for that source.
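The kind of gap at issue can be illustrated as follows (a hedged sketch, not a construction from the paper: SHA-256 in counter mode stands in for a generic pseudorandom generator). The output distribution of a generator seeded with 128 bits has Shannon entropy at most 128 bits in total, yet simple statistical tests on the output stream behave as if it were uniform, so no efficient compressor can exploit the low entropy:

```python
import hashlib
import math
from collections import Counter

def prg(seed: bytes, n: int) -> bytes:
    """Deterministically expand a short seed to n bytes
    (SHA-256 in counter mode, used purely as an illustration)."""
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(seed + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def empirical_byte_entropy(data: bytes) -> float:
    """Entropy, in bits per byte, of the empirical byte-frequency distribution."""
    counts = Counter(data)
    total = len(data)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

stream = prg(b"\x00" * 16, 4096)          # 128-bit seed -> 4096 output bytes
print(empirical_byte_entropy(stream))      # close to 8 bits/byte, like a uniform source
# Yet the true entropy of the output distribution is at most 128 bits in total,
# i.e. at most 128/4096 ~ 0.03 bits per byte over the whole stream.
```

Of course, passing a frequency test is far weaker than pseudorandomness against all polynomial-time distinguishers; the sketch only makes the low-entropy-but-random-looking phenomenon tangible.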

This paper analyses three measures that try to capture the compressibility of a source, establishing relations and separations between them, and examines two special cases: the uniform distribution and the universal distribution m^t over binary strings of a fixed size. It is shown that for the uniform distribution the three measures are equivalent, and that for m^t there is a clear separation between metric-type entropy, and thus pseudo-entropy, and the maximum compressibility of a source.




References

  1. Barak, B., Shaltiel, R., Wigderson, A.: Computational Analogues of Entropy. In: Arora, S., Jansen, K., Rolim, J.D.P., Sahai, A. (eds.) RANDOM 2003 and APPROX 2003. LNCS, vol. 2764, pp. 200–215. Springer, Heidelberg (2003). Available at http://www.math.ias.edu/~avi/PUBLICATIONS/MYPAPERS/BSW03/bsw03.ps

  2. Chaitin, G.J.: On the length of programs for computing finite binary sequences. Journal of the ACM 13(4), 547–569 (1966)

  3. Goldberg, A., Sipser, M.: Compression and Ranking. SIAM Journal on Computing 20(3), 524–536 (1991)

  4. Grünwald, P., Vitányi, P.: Kolmogorov Complexity and Information Theory. Journal of Logic, Language and Information 12(4), 497–529 (2003). Available at http://citeseer.ist.psu.edu/565384.html

  5. Håstad, J., Impagliazzo, R., Levin, L., Luby, M.: A Pseudorandom Generator from any One-way Function. SIAM Journal on Computing 28(4), 1364–1396 (1999). Available at http://citeseer.ist.psu.edu/hastad99pseudorandom.html

  6. Kolmogorov, A.N.: Three approaches to the quantitative definition of information. Problems of Information Transmission 1(1), 1–7 (1965)

  7. Li, M., Vitányi, P.M.B.: An Introduction to Kolmogorov Complexity and Its Applications, 2nd edn. Springer, Heidelberg (1997)

  8. Shannon, C.E.: A Mathematical Theory of Communication. Bell System Technical Journal 27, 379–423 and 623–656 (1948)

  9. Wee, H.: On Pseudoentropy versus Compressibility. In: Proceedings of the IEEE Conference on Computational Complexity, pp. 29–41 (2004). Available at http://ieeexplore.ieee.org/iel5/9188/29139/01313782.pdf

  10. Yao, A.: Computational Information Theory. In: Complexity in Information Theory, pp. 1–15. Springer, Heidelberg (1988)



Copyright information

© 2007 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Pinto, A. (2007). Comparing Notions of Computational Entropy. In: Cooper, S.B., Löwe, B., Sorbi, A. (eds) Computation and Logic in the Real World. CiE 2007. Lecture Notes in Computer Science, vol 4497. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-73001-9_63


  • DOI: https://doi.org/10.1007/978-3-540-73001-9_63

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-73000-2

  • Online ISBN: 978-3-540-73001-9

