Abstract:
Let $X_1, X_2, \cdots$ be a sequence of independent identically distributed observations with a common mean $\mu$. Assume that $0 \leq X_i \leq 1$ with probability 1. We show that for each $\varepsilon > 0$ there exists an integer $m$, a finite-valued statistic $T_n = T_n(X_1, \cdots, X_n) \in \{t_1, \cdots, t_m\}$, and a real-valued function $d$ defined on $\{t_1, \cdots, t_m\}$ such that i) $T_{n+1} = f_n(T_n, X_{n+1})$; ii) $P[\limsup |d(T_n) - \mu| \leq \varepsilon] = 1$. Thus we have a recursive-like estimate of $\mu$, for which the data are summarized for each $n$ by one of $m$ states and which converges to within $\varepsilon$ of $\mu$ with probability 1. The constraint on memory here is time-varying, as contrasted with the time-invariant constraint that would have $T_{n+1} = f(T_n, X_{n+1})$ for all $n$.
Published in: IEEE Transactions on Information Theory (Volume: 18, Issue: 4, July 1972)
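The paper's actual construction is not reproduced in the abstract; the following minimal Python sketch only illustrates the interface described there: a statistic confined to $m$ memory states, a time-varying update $T_{n+1} = f_n(T_n, X_{n+1})$, and a real-valued readout $d$. The particular state set (a grid of quantization levels), the quantized running-average update, and the Bernoulli test data are assumptions made for illustration, and this scheme carries no almost-sure $\varepsilon$-accuracy guarantee of the kind proved in the paper.

```python
import random

# Illustrative sketch only: a finite-state, time-varying recursive update of the
# form T_{n+1} = f_n(T_n, X_{n+1}) with a readout d(T_n).  The state is one of
# M quantization levels of a running average.  This is NOT the paper's
# construction; it merely shows the memory-constrained interface.

M = 101                                   # number of memory states t_1, ..., t_m (assumed)
LEVELS = [k / (M - 1) for k in range(M)]  # states identified with grid points in [0, 1]


def d(state_index: int) -> float:
    """Readout d: map the finite-valued statistic to a real estimate of mu."""
    return LEVELS[state_index]


def f(n: int, state_index: int, x: float) -> int:
    """Time-varying update f_n: fold the observation X_{n+1} into the state."""
    # Running-average step with gain 1/(n+1), then re-quantize to the nearest level.
    updated = d(state_index) + (x - d(state_index)) / (n + 1)
    return min(range(M), key=lambda k: abs(LEVELS[k] - updated))


if __name__ == "__main__":
    random.seed(0)
    mu = 0.3
    T = 0                                              # initial state (assumed)
    for n in range(100_000):
        x = 1.0 if random.random() < mu else 0.0       # Bernoulli(mu) observations in [0, 1]
        T = f(n, T, x)
    print(f"estimate d(T_n) = {d(T):.3f}, true mu = {mu}")
```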