Abstract:
In applications involving estimation, the relevant model classes of probability distributions are often too complex to admit estimators whose convergence rates to the truth can be uniformly bounded over the entire model class as the sample size increases (uniform consistency). While it is often possible to obtain pointwise guarantees, so that the convergence rate of the estimator can be bounded in a model-dependent way, such pointwise guarantees are unsatisfactory: estimator performance is a function of the very unknown quantity that is being estimated. Therefore, even if an estimator is consistent, it may not be clear how well it is doing at any given sample size. Departing from this traditional uniform/pointwise dichotomy, a new analysis framework is explored by characterizing model classes of probability distributions that may only admit pointwise guarantees, yet where all the information about the unknown model needed to gauge estimator accuracy can be inferred from the sample at hand. To give a concrete focus to this suggested broad new paradigm, we analyze the universal compression problem in this data-driven pointwise consistency framework.
Published in: 2014 IEEE International Symposium on Information Theory
Date of Conference: 29 June 2014 - 04 July 2014
Date Added to IEEE Xplore: 11 August 2014
Electronic ISBN: 978-1-4799-5186-4