Abstract
Incremental learning of sensorimotor transformations in high-dimensional spaces is a basic prerequisite for the success of autonomous robot devices as well as biological movement systems. So far, due to the sparsity of data in high-dimensional spaces, learning in such settings has required a significant amount of prior knowledge about the learning task, usually provided by a human expert. In this paper we suggest a partial revision of this view. Based on empirical studies, we observed that, despite being globally high dimensional and sparse, data distributions from physical movement systems are locally low dimensional and dense. Under this assumption, we derive a learning algorithm, Locally Adaptive Subspace Regression, that exploits this property by combining a dynamically growing local dimensionality reduction technique, as a preprocessing step, with a nonparametric learning technique, locally weighted regression, that also learns the region of validity of the regression. The usefulness of the algorithm and the validity of its assumptions are illustrated on a synthetic data set and on data from the inverse dynamics of human arm movements and of an actual seven degree-of-freedom anthropomorphic robot arm.
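The core idea of the abstract — reduce the input dimensionality locally, then fit a locally weighted regression in the reduced coordinates — can be sketched as follows. This is a minimal batch illustration, not the paper's algorithm: the function name `lwr_predict`, the Gaussian kernel, the fixed `bandwidth`, and the use of an eigendecomposition in place of the paper's incremental, dynamically growing dimensionality reduction are all assumptions made for the sake of a self-contained example.

```python
import numpy as np

def lwr_predict(X, y, x_query, bandwidth=0.5, n_components=2):
    """Predict y at x_query: weight points by distance to the query,
    project onto a local low-dimensional subspace, then solve a
    weighted least-squares fit in that subspace."""
    # Gaussian kernel weights centred on the query point
    d2 = np.sum((X - x_query) ** 2, axis=1)
    w = np.exp(-d2 / (2.0 * bandwidth ** 2))

    # Weighted local PCA: leading eigenvectors of the weighted covariance
    mean = np.average(X, axis=0, weights=w)
    Xc = X - mean
    cov = (Xc * w[:, None]).T @ Xc / w.sum()
    _, vecs = np.linalg.eigh(cov)
    U = vecs[:, -n_components:]      # top principal directions
    Z = Xc @ U                       # low-dimensional coordinates
    zq = (x_query - mean) @ U

    # Weighted linear regression (with intercept) in the subspace
    Za = np.hstack([Z, np.ones((len(Z), 1))])
    W = np.diag(w)
    beta = np.linalg.lstsq(Za.T @ W @ Za, Za.T @ W @ y, rcond=None)[0]
    return np.append(zq, 1.0) @ beta
```

If the data are indeed locally low dimensional, the regression in the projected coordinates is well conditioned even when the ambient dimension is large and the global distribution is sparse, which is exactly the property the abstract exploits.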
Vijayakumar, S., Schaal, S. Local Adaptive Subspace Regression. Neural Processing Letters 7, 139–149 (1998). https://doi.org/10.1023/A:1009696221209