Shannon Entropy vs. Kolmogorov Complexity

  • Conference paper
Computer Science – Theory and Applications (CSR 2006)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 3967)

Abstract

Most assertions involving Shannon entropy have their Kolmogorov complexity counterparts. A general theorem of Romashchenko [4] states that every information inequality that is valid in Shannon’s theory is also valid in Kolmogorov’s theory, and vice versa. In this paper we prove that this is no longer true for ∀∃-assertions, exhibiting the first example where the formal analogy between Shannon entropy and Kolmogorov complexity fails.
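As an informal illustration (not part of the paper itself), the Shannon side of such an information inequality, e.g. subadditivity H(X,Y) ≤ H(X) + H(Y), can be checked numerically on an empirical joint distribution. The `shannon_entropy` helper below is a hypothetical name introduced for this sketch:

```python
import math
from collections import Counter

def shannon_entropy(counts):
    """Shannon entropy in bits of the empirical distribution given by a Counter."""
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total)
                for c in counts.values() if c > 0)

# A toy joint sample of pairs (x, y); here the joint is uniform over 4 outcomes.
pairs = [("a", 0), ("a", 0), ("a", 1), ("b", 1),
         ("b", 1), ("b", 0), ("a", 1), ("b", 0)]

h_xy = shannon_entropy(Counter(pairs))              # joint entropy H(X, Y)
h_x = shannon_entropy(Counter(x for x, _ in pairs))  # marginal entropy H(X)
h_y = shannon_entropy(Counter(y for _, y in pairs))  # marginal entropy H(Y)

# Subadditivity, one of the basic information inequalities:
assert h_xy <= h_x + h_y + 1e-9
print(round(h_x, 3), round(h_y, 3), round(h_xy, 3))  # prints 1.0 1.0 2.0
```

Romashchenko’s theorem [4] says that any such linear inequality holds for Shannon entropy exactly when its Kolmogorov-complexity analogue holds (up to logarithmic terms); the paper’s point is that this transfer breaks down for statements with an ∀∃ quantifier structure.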

References

  1. Sipser, M.: Expanders, randomness, or time versus space. J. Comput. and System Sci. 36, 379–383 (1988)

  2. Bennett, C.H., Gács, P., Li, M., Vitányi, P., Zurek, W.: Information Distance. IEEE Transactions on Information Theory 44(4), 1407–1423 (1998)

  3. Chernov, A., Muchnik, A., Romashchenko, A., Shen, A., Vereshchagin, N.: Upper semi-lattice of binary strings with the relation ‘x is simple conditional to y’. Theoretical Computer Science 271, 69–95 (2002); Preliminary version in: 14th Annual IEEE Conference on Computational Complexity, Atlanta, May 4–6, pp. 114–122 (1999)

  4. Hammer, D., Romashchenko, A., Shen, A., Vereshchagin, N.: Inequalities for Shannon entropy and Kolmogorov complexity. Journal of Computer and System Sciences 60, 442–464 (2000)

  5. Li, M., Vitányi, P.M.B.: An Introduction to Kolmogorov Complexity and its Applications, 2nd edn. Springer, New York (1997)

  6. Kolmogorov, A.N.: Three approaches to the quantitative definition of information. Problems Inform. Transmission 1(1), 1–7 (1965)

  7. Slepian, D., Wolf, J.K.: Noiseless Coding of Correlated Information Sources. IEEE Trans. Inform. Theory IT-19, 471–480 (1973)

  8. Muchnik, A.A.: Conditional complexity and codes. Theoretical Computer Science 271, 97–109 (2002)

  9. Shannon, C.E.: A mathematical theory of communication. Bell Sys. Tech. J. 27, 379–423 and 623–656 (1948)

  10. Solomonoff, R.J.: A formal theory of inductive inference, Parts 1 and 2. Information and Control 7, 1–22 and 224–254 (1964)

  11. Uspensky, V.A., Shen, A.: Relations Between Varieties of Kolmogorov Complexities. Mathematical Systems Theory 29(3), 271–292 (1996)

Copyright information

© 2006 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Muchnik, A., Vereshchagin, N. (2006). Shannon Entropy vs. Kolmogorov Complexity. In: Grigoriev, D., Harrison, J., Hirsch, E.A. (eds) Computer Science – Theory and Applications. CSR 2006. Lecture Notes in Computer Science, vol 3967. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11753728_29

  • DOI: https://doi.org/10.1007/11753728_29

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-34166-6

  • Online ISBN: 978-3-540-34168-0

  • eBook Packages: Computer Science, Computer Science (R0)
