
Optimal estimation of families of models



Abstract:

Given a class of parametric models M_k = {f(x^n; θ, k) : θ = (θ_1, …, θ_k) ∈ Ω_k}, where x^n = x_1, …, x_n denotes real-valued data and θ the parameters, the Cramér-Rao inequality gives a lower bound for the covariance of the estimation error when only one model is to be estimated, and the maximum likelihood estimator achieves this lower bound asymptotically; moreover, the bound shrinks to zero as n → ∞. We study the more complicated problem in which a family of models f(x^n; θ_1), …, f(x^n; θ_m) is to be estimated; in fact, hypothesis testing may be viewed as such a problem. We show that there is a family of models, which we call optimally distinguishable, that can be estimated with the smallest worst-case error. Moreover, if we let the number of models m_n grow with n, there is a fastest growth rate such that if m_n grows more slowly, the members can be consistently estimated, i.e. estimated without error in the limit as n → ∞, and otherwise not. This is reminiscent of and related to Shannon's channel capacity, which, however, as such cannot be applied to the problem considered.
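As an illustrative sketch of the single-model baseline the abstract starts from (not taken from the paper itself): for estimating the mean θ of a Gaussian with known variance σ², the Cramér-Rao lower bound is σ²/n, and the maximum likelihood estimator, the sample mean, attains it. The Monte Carlo check below, with assumed values n = 100 and σ = 1, compares the empirical variance of the MLE against the bound.

```python
import random
import statistics

# Hypothetical setup: Gaussian data with known sigma; the quantity to
# estimate is the mean theta. The MLE is the sample mean.
random.seed(0)

n = 100          # samples per experiment
trials = 20000   # Monte Carlo repetitions
sigma = 1.0
theta = 0.0      # true mean

estimates = []
for _ in range(trials):
    xs = [random.gauss(theta, sigma) for _ in range(n)]
    estimates.append(sum(xs) / n)   # MLE of the mean

empirical_var = statistics.pvariance(estimates)
crb = sigma ** 2 / n                # Cramér-Rao lower bound sigma^2/n

print(empirical_var, crb)
```

The empirical variance comes out close to the bound 0.01, and since the bound σ²/n → 0 as n → ∞, a single model is estimated consistently, which is the situation the paper generalizes to families of m_n models.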
Date of Conference: 05-09 May 2008
Date Added to IEEE Xplore: 25 July 2008
Conference Location: Porto, Portugal