Abstract
Motivated by sensor networks and other distributed settings, several models for distributed learning are presented. The models differ from classical works in statistical pattern recognition by allocating the observations of an i.i.d. sampling process among the members of a network of learning agents. The agents are limited in their ability to communicate with a fusion center, so the amount of information available for classification or regression is constrained. For several simple communication models, questions of universal consistency are addressed; that is, the asymptotics of several agent decision rules and fusion rules are considered in both the binary classification and regression frameworks. These models resemble distributed environments and raise new questions regarding universal consistency. Insofar as they offer a useful picture of distributed scenarios, this paper considers whether the guarantees provided by Stone's Theorem in centralized environments continue to hold in distributed settings.
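As background only (this is the classical centralized statement from Stone (1977) and Györfi et al. (2002), not the paper's distributed results), the guarantee in question concerns locally weighted average regression estimates of the form

\[
m_n(x) \;=\; \sum_{i=1}^{n} W_{n,i}(x;\, X_1,\ldots,X_n)\, Y_i ,
\]

and Stone's Theorem asserts that, under Stone's conditions on the weights \(W_{n,i}\),

\[
\lim_{n \to \infty} \mathbb{E}\!\left[\bigl(m_n(X) - m(X)\bigr)^2\right] \;=\; 0
\qquad \text{for every distribution of } (X,Y) \text{ with } \mathbb{E}[Y^2] < \infty ,
\]

where \(m(x) = \mathbb{E}[Y \mid X = x]\). For example, \(k_n\)-nearest-neighbor weights satisfy these conditions whenever \(k_n \to \infty\) and \(k_n / n \to 0\), which yields universal consistency in the centralized setting; the analogous classification statement follows via plug-in rules.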
This research was supported in part by the Army Research Office under grant DAAD19-00-1-0466, in part by Draper Laboratory under grant IR&D 6002, in part by the National Science Foundation under grant CCR-0312413, and in part by the Office of Naval Research under grant N00014-03-1-0102.
References
Stone, C.J.: Consistent nonparametric regression. Ann. Statist. 5, 595–645 (1977)
Devroye, L., Györfi, L., Lugosi, G.: A Probabilistic Theory of Pattern Recognition. Springer, New York (1996)
Györfi, L., Kohler, M., Krzyzak, A., Walk, H.: A Distribution-Free Theory of Nonparametric Regression. Springer, New York (2002)
Akyildiz, I.F., Su, W., Sankarasubramaniam, Y., Cayirci, E.: A survey on sensor networks. IEEE Communications Magazine 40, 102–114 (2002)
Cover, T.M.: Rates of convergence for nearest neighbor procedures. In: Proc. 1st Annu. Hawaii Conf. Systems Theory, pp. 413–415 (1968)
Greblicki, W., Pawlak, M.: Necessary and sufficient conditions for Bayes risk consistency of recursive kernel classification rule. IEEE Trans. Inform. Theory IT-33, 408–412 (1987)
Krzyzak, A.: The rates of convergence of kernel regression estimates and classification rules. IEEE Trans. Inform. Theory IT-32, 668–679 (1986)
Kulkarni, S.R., Posner, S.E.: Rates of convergence of nearest neighbor estimation under arbitrary sampling. IEEE Trans. Inform. Theory 41, 1028–1039 (1995)
Kulkarni, S.R., Posner, S.E., Sandilya, S.: Data-dependent kn-NN and kernel estimators consistent for arbitrary processes. IEEE Trans. Inform. Theory 48, 2785–2788 (2002)
Morvai, G., Kulkarni, S.R., Nobel, A.B.: Regression estimation from an individual stable sequence. Statistics 33, 99–119 (1999)
Nobel, A.B.: Limits to classification and regression estimation from ergodic processes. Ann. Statist. 27, 262–273 (1999)
Nobel, A.B., Adams, T.M.: On regression estimation from ergodic samples with additive noise. IEEE Trans. Inform. Theory 47, 2895–2902 (2001)
Roussas, G.: Nonparametric estimation in Markov processes. Ann. Inst. Statist. Math. 21, 73–87 (1967)
Yakowitz, S.: Nearest neighbor regression estimation for null-recurrent Markov time series. Stoch. Processes Appl. 48, 311–318 (1993)
Lugosi, G.: Learning with an unreliable teacher. Pattern Recognition 25, 79–87 (1992)
Kolmogorov, A.N., Fomin, S.V.: Introductory Real Analysis. Dover, New York (1975)
Copyright information
© 2004 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Predd, J.B., Kulkarni, S.R., Poor, H.V. (2004). Consistency in Models for Communication Constrained Distributed Learning. In: Shawe-Taylor, J., Singer, Y. (eds) Learning Theory. COLT 2004. Lecture Notes in Computer Science, vol 3120. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-27819-1_31
DOI: https://doi.org/10.1007/978-3-540-27819-1_31
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-22282-8
Online ISBN: 978-3-540-27819-1
eBook Packages: Springer Book Archive