Abstract
We compare the elementary theories of Shannon information and Kolmogorov complexity, the extent to which they have a common purpose, and where they are fundamentally different. We discuss and relate the basic notions of both theories: Shannon entropy, Kolmogorov complexity, Shannon mutual information and Kolmogorov ("algorithmic") mutual information. We explain how universal coding may be viewed as a middle ground between the two theories. We consider Shannon's rate distortion theory, which quantifies useful (in a certain sense) information. We use the communication of information as our guiding motif, and we explain how it relates to sequential question-answer sessions.
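To make the Shannon-side notions mentioned above concrete, here is a minimal sketch in Python of Shannon entropy and mutual information for finite distributions (the function names and toy distributions are ours, chosen for illustration):

```python
import math

def entropy(p):
    """Shannon entropy H(P) = -sum_i p_i log2 p_i, in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), with the joint distribution
    given as a dict mapping (x, y) outcomes to probabilities."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return entropy(px.values()) + entropy(py.values()) - entropy(joint.values())

# A fair coin carries one bit of entropy:
print(entropy([0.5, 0.5]))  # 1.0

# Two independent fair coins share no information:
indep = {(x, y): 0.25 for x in (0, 1) for y in (0, 1)}
print(mutual_information(indep))  # 0.0

# A perfectly correlated pair shares one full bit:
corr = {(0, 0): 0.5, (1, 1): 0.5}
print(mutual_information(corr))  # 1.0
```

The Kolmogorov-side analogues (complexity and algorithmic mutual information) are defined via shortest programs rather than probabilities and are uncomputable, so they admit no such direct implementation.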
Grünwald, P.D. and Vitányi, P.M., 2003, "Kolmogorov complexity and information theory. With an interpretation in terms of questions and answers," Journal of Logic, Language and Information 12, 497–529. https://doi.org/10.1023/A:1025011119492