Elsevier

Neurocomputing

Volume 9, Issue 3, December 1995, Pages 243-269
Paper
Memory-based neural networks for robot learning

https://doi.org/10.1016/0925-2312(95)00033-6

Abstract

This paper explores a memory-based approach to robot learning, using memory-based neural networks to learn models of the task to be performed. Steinbuch and Taylor presented neural network designs to explicitly store training data and do nearest neighbor lookup in the early 1960s. In this paper their nearest neighbor network is augmented with a local model network, which fits a local model to a set of nearest neighbors. This network design is equivalent to a statistical approach known as locally weighted regression, in which a local model is formed to answer each query, using a weighted regression in which nearby points (similar experiences) are weighted more than distant points (less relevant experiences). We illustrate this approach by describing how it has been used to enable a robot to learn a difficult juggling task.
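The locally weighted regression idea described above can be sketched in a few lines: store all training experiences verbatim, and at query time fit a linear model in which each stored point is weighted by its proximity to the query. The following is a minimal illustrative sketch, not the authors' implementation; the Gaussian kernel, the `bandwidth` parameter, and the function name are assumptions chosen for clarity.

```python
import numpy as np

def locally_weighted_regression(X, y, x_query, bandwidth=0.1):
    """Predict y at x_query by fitting a weighted linear model,
    weighting stored experiences (rows of X) by proximity to the query.
    Illustrative sketch only; kernel choice and bandwidth are assumptions."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    # Gaussian kernel: nearby points (similar experiences) get weight
    # near 1, distant points (less relevant experiences) near 0.
    dists = np.linalg.norm(X - x_query, axis=1)
    w = np.exp(-dists**2 / (2.0 * bandwidth**2))
    # Augment with a bias column so the local model has an intercept.
    A = np.hstack([X, np.ones((len(X), 1))])
    W = np.diag(w)
    # Weighted least squares: beta = (A^T W A)^+ A^T W y
    beta = np.linalg.pinv(A.T @ W @ A) @ A.T @ W @ y
    return float(np.append(x_query, 1.0) @ beta)
```

For example, given samples of y = x², a query at x = 0.5 with a small bandwidth returns a value close to 0.25, because only the experiences near 0.5 carry appreciable weight in the fit. Note that, as in the memory-based network described in the paper, a fresh local model is formed for every query.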

References (74)

  • R.E. Barnhill

    Representation and approximation of surfaces

  • P.E. Cheng

    Strong consistency of nearest neighbor regression function estimators

    J. Multivariate Analysis

    (1984)
  • W.S. Cleveland et al.

    Locally weighted regression: An approach to regression analysis by local fitting

    J. Amer. Statistical Assoc.

    (1988)
  • J.D. Cowan et al.

    Neural nets

    Quarterly Rev. Biophysics

    (1988)
  • I.K. Crain et al.

    Treatment of nonequispaced two dimensional data with a digital computer

    Geoexploration

    (1967)
  • L. Devroye

    On the almost everywhere convergence of nonparametric regression function estimates

    Annals Statistics

    (1981)
  • N.R. Draper et al.
  • R. Durbin et al.

Product units: a computationally powerful and biologically plausible extension to backpropagation networks

    Neural Computat.

    (1989)
  • R.L. Eubank
  • K.J. Falconer

    A general purpose algorithm for contouring over scattered data points

  • J.D. Farmer et al.

    Exploiting chaos to predict the future and reduce noise

  • J.D. Farmer et al.

    Predicting chaotic dynamics

  • R. Farwig

    Multivariate interpolation of scattered data by moving least squares methods

  • E. Fix et al.

    Discriminatory analysis, nonparametric discrimination: Consistency properties

  • E. Fix et al.

    Discriminatory analysis: Small sample performance

  • R. Franke

    Scattered data interpolation: Tests of some methods

    Math. Computat.

    (1982)
  • R. Franke et al.

    Smooth interpolation of large sets of scattered data

    Int. J. Numerical Methods Eng.

    (1980)
  • J.H. Friedman et al.

    An algorithm for finding best matches in logarithmic expected time

    ACM Trans. Math. Software

    (1977)
  • K.Y. Goldberg et al.

    Using a neural network to learn the dynamics of the CMU Direct-Drive Arm II

  • W.J. Gordon et al.

    Shepard's method of metric interpolation to bivariate and multivariate interpolation

    Math. Computat.

    (1978)
  • E. Grosse

    LOESS: Multivariate smoothing by moving least squares

  • F.R. Hampel et al.
  • D. Hillis
  • S. Kawamura et al.

    Memory-based control for recognition of motion environment and planning of effective locomotion

  • S. Kawamura et al.

    Hierarchical data structure in memory-based control of robots

  • H. Kazmierczak et al.

    Adaptive systems in pattern recognition

    IEEE Trans. Electronic Comput.

    (1963)
  • P. Lancaster

    Moving weighted least-squares methods
