
Asymptotic distributions associated to unsupervised Oja's learning equation

  • Part IV: Signal Processing: Blind Source Separation, Vector Quantization, and Self Organization
  • Conference paper
Artificial Neural Networks — ICANN'97 (ICANN 1997)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 1327)


Abstract

In this paper, we perform a complete asymptotic performance analysis of the stochastic approximation algorithm, denoted the Subspace Network Learning (SNL) algorithm, derived from Oja's learning equation, in the case where the learning rate is constant and a large number of patterns is available. Using a general result of Gaussian approximation theory, we derive the asymptotic distribution of the estimated projection matrix $WW^T$ associated with the connection weight matrix $W$. Closed-form expressions are given for the asymptotic covariance of the projection matrix estimated by the SNL algorithm and by the smoothed SNL algorithm that we introduce.
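For concreteness, the following is a minimal NumPy sketch of the kind of recursion the abstract refers to. It assumes the standard form of Oja's subspace (SNL) rule with constant learning rate γ, W ← W + γ(I − WWᵀ)xxᵀW; the function name snl_step, the demo parameters, and the exponential averaging of WWᵀ labelled "smoothing" below are illustrative assumptions, since the paper's exact smoothed SNL recursion is not given in the abstract.

```python
import numpy as np

def snl_step(W, x, gamma):
    # One SNL update with constant step size gamma:
    #   W <- W + gamma * (I - W W^T) x x^T W
    y = W.T @ x                        # r-dimensional projection of the pattern
    return W + gamma * np.outer(x - W @ y, y)

rng = np.random.default_rng(0)
n, r, gamma = 5, 2, 1e-2
A = rng.standard_normal((n, n))
C = A @ A.T                            # fixed covariance of the input patterns
L = np.linalg.cholesky(C)
W = np.linalg.qr(rng.standard_normal((n, r)))[0]  # orthonormal initial weights
P_smooth = W @ W.T                     # smoothed projection estimate (illustrative)

for _ in range(20000):
    x = L @ rng.standard_normal(n)     # draw a pattern with covariance C
    W = snl_step(W, x, gamma)
    P_smooth += gamma * (W @ W.T - P_smooth)  # exponential averaging of WW^T

# Reference: projector onto the dominant r-dimensional eigen-subspace of C.
w_eig, V = np.linalg.eigh(C)
P_true = V[:, -r:] @ V[:, -r:].T
print("raw SNL error:    ", np.linalg.norm(W @ W.T - P_true))
print("smoothed estimate:", np.linalg.norm(P_smooth - P_true))
```

Under the abstract's setting (constant learning rate, many patterns), WWᵀ fluctuates around the projector onto the dominant eigen-subspace of the input covariance; averaging the projection estimate, as P_smooth does here, is one natural way to reduce that asymptotic variance.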




Author information

J.P. Delmas

Editor information

Wulfram Gerstner, Alain Germond, Martin Hasler, Jean-Daniel Nicoud


Copyright information

© 1997 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Delmas, J.P. (1997). Asymptotic distributions associated to unsupervised Oja's learning equation. In: Gerstner, W., Germond, A., Hasler, M., Nicoud, JD. (eds) Artificial Neural Networks — ICANN'97. ICANN 1997. Lecture Notes in Computer Science, vol 1327. Springer, Berlin, Heidelberg. https://doi.org/10.1007/BFb0020227


  • DOI: https://doi.org/10.1007/BFb0020227

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-63631-1

  • Online ISBN: 978-3-540-69620-9

  • eBook Packages: Springer Book Archive
