
Information and Computation

Volume 237, October 2014, Pages 101-141

Generalised entropies and asymptotic complexities of languages

https://doi.org/10.1016/j.ic.2014.01.001

Abstract

The paper explores connections between asymptotic complexity and generalised entropy. The asymptotic complexity of a language (a set of finite or infinite strings) formalises the difficulty of predicting the next element in a sequence: it is the loss per element of a strategy that is asymptotically optimal for that language. Generalised entropy extends Shannon entropy to arbitrary loss functions; it is the optimal expected loss given a distribution on possible outcomes. It turns out that the set of tuples of asymptotic complexities of a language w.r.t. different loss functions can be described by means of the generalised entropies corresponding to those loss functions.
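The notion of generalised entropy described in the abstract can be illustrated numerically. The sketch below (an illustration, not code from the paper; the function names and the grid-search minimisation are my own) computes the generalised entropy of a Bernoulli(p) distribution as the minimum expected loss over predictions gamma in [0, 1]. With logarithmic loss this recovers Shannon entropy (in nats), and with square loss it gives p(1 - p):

```python
import math

def generalised_entropy(p, loss, grid_size=100001):
    """Generalised entropy of a Bernoulli(p) distribution:
    the minimum expected loss over all predictions gamma in [0, 1],
    approximated here by a grid search (illustrative, not from the paper)."""
    best = float("inf")
    for i in range(grid_size):
        gamma = i / (grid_size - 1)
        expected = p * loss(1, gamma) + (1 - p) * loss(0, gamma)
        best = min(best, expected)
    return best

def log_loss(omega, gamma):
    # Clamp gamma away from 0 and 1 to avoid log(0).
    eps = 1e-12
    gamma = min(max(gamma, eps), 1 - eps)
    return -math.log(gamma) if omega == 1 else -math.log(1 - gamma)

def square_loss(omega, gamma):
    return (omega - gamma) ** 2

p = 0.3
# Log loss: generalised entropy equals the Shannon entropy -p ln p - (1-p) ln(1-p).
shannon = -p * math.log(p) - (1 - p) * math.log(1 - p)
print(abs(generalised_entropy(p, log_loss) - shannon) < 1e-4)      # True
# Square loss: generalised entropy equals p(1-p), minimised at gamma = p.
print(abs(generalised_entropy(p, square_loss) - p * (1 - p)) < 1e-6)  # True
```

In both cases the minimising prediction is gamma = p, so the generalised entropy is a concave function of the distribution, just as Shannon entropy is.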


A previous version of this paper was published in: Proceedings of the 20th Annual Conference on Learning Theory, COLT 2007, Lecture Notes in Computer Science, vol. 4539, Springer, 2007.