Abstract:
Based on rescaling by some suitable sequence instead of the number of time units, the usual notion of divergence rate is here extended to define and determine meaningful generalized divergence rates. Rescaling entropy rates appears as a special case. A suitable rescaling is naturally induced by the asymptotic behavior of the marginal divergences. Closed-form formulas are obtained as soon as the marginal divergences behave like powers of some analytic functions. A wide class of countable Markov chains is proved to satisfy this property. Most divergence and entropy functionals defined in the literature are covered, e.g., the classical Shannon, Kullback-Leibler, Rényi, and Tsallis functionals. For illustration purposes, the Ferreri and Basu-Harris-Hjort-Jones divergences, among others, are also considered.
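
A minimal sketch of the rescaling idea the abstract describes, in LaTeX; the notation (marginal distributions P_n, Q_n, a divergence functional D, and a normalizing sequence v_n) is assumed here for illustration and is not taken verbatim from the paper:

% Classical divergence rate: normalize by the number of time units n.
% Generalized divergence rate: normalize by a suitable sequence (v_n) instead,
% chosen to match the asymptotic growth of the marginal divergences.
\[
  \text{classical rate:}\quad
  \lim_{n\to\infty} \frac{1}{n}\, D\bigl(P_n \,\|\, Q_n\bigr),
  \qquad
  \text{generalized rate:}\quad
  \lim_{n\to\infty} \frac{D\bigl(P_n \,\|\, Q_n\bigr)}{v_n}.
\]

Taking v_n = n recovers the usual rate; per the abstract, a suitable (v_n) is induced by the asymptotic behavior of the marginal divergences, and closed-form limits follow when those divergences behave like powers of analytic functions.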
Published in: IEEE Transactions on Information Theory (Volume: 61, Issue: 11, November 2015)