Abstract
In this paper, the dynamical behavior of Oja’s neural network [7] is analyzed. Oja’s net has traditionally been studied in a continuous-time setting via simplification procedures, some of which concern the asymptotic behavior of the learning gain. The contribution of this paper is the study of a deterministic discrete-time (DDT) version that preserves the discrete-time form of the original network and allows a more realistic treatment of the learning gain. As a consequence, the discrete-time nature of the new model leads to results that differ drastically from those known for the continuous-time formulation. Simulation examples support the presented results.
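As background for the DDT formulation studied here, the one-unit Oja rule can be averaged over the inputs and written as a deterministic iteration on the input correlation matrix C, namely w(k+1) = w(k) + eta [C w(k) − (w(k)ᵀ C w(k)) w(k)] with learning gain eta. The Python sketch below is only illustrative (the function name, the constant gain value, and the test matrix are assumptions, not the paper's exact model or parameter choices):

    import numpy as np

    def oja_ddt(C, w0, eta=0.05, n_steps=1000):
        """Illustrative one-unit DDT Oja iteration with a constant learning gain.

        Implements w(k+1) = w(k) + eta * (C w(k) - (w(k)^T C w(k)) w(k)),
        where C is the (symmetric, positive semidefinite) input correlation matrix.
        The gain value and step count are arbitrary assumptions for this sketch.
        """
        w = np.asarray(w0, dtype=float)
        for _ in range(n_steps):
            Cw = C @ w
            w = w + eta * (Cw - (w @ Cw) * w)
        return w

    # Hypothetical example: a diagonal correlation matrix with dominant eigenvalue 3.0.
    C = np.diag([3.0, 1.0, 0.5])
    w = oja_ddt(C, w0=[0.3, 0.5, 0.8], eta=0.05)
    print(w)  # for a sufficiently small gain, w approaches a unit principal eigenvector

For small constant gains this iteration tends toward a unit eigenvector of the largest eigenvalue of C; the paper's point is precisely that the choice of learning gain changes the discrete-time dynamics relative to the continuous-time analysis.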
References
[1] F. T. Arecchi, R. Badii, and A. Politi, Low-frequency phenomena in dynamical systems with many attractors, Phys. Rev. A, Vol. 29, No. 2, pp. 1006–1009, 1984.
[2] J. A. Berzal, P. J. Zufiria, and L. Rodriguez, Implementing the Karhunen-Loève transform via improved neural networks, in Solving Engineering Problems with Neural Networks: International Conference on Engineering Applications of Neural Networks (EANN'96), pp. 375–378, London, 1996. ISBN 952-90-7517-0.
[3] H. J. Kushner and D. S. Clark, Stochastic Approximation Methods for Constrained and Unconstrained Systems, Springer-Verlag, New York, 1978.
[4] S. Haykin, Neural Networks: A Comprehensive Foundation, Macmillan Publishing Company, 1994.
[5] R.-W. Liu, Y.-F. Huang, and X.-T. Ling, A Novel Approach to the Convergence of Neural Networks for Signal Processing, IEEE Transactions on Circuits and Systems, Vol. 42, No. 3, pp. 187–188, 1995.
[6] L. Ljung, Analysis of recursive stochastic algorithms, IEEE Transactions on Automatic Control, Vol. AC-22, pp. 551–575, 1977.
[7] E. Oja, A Simplified Neuron Model as a Principal Component Analyzer, Journal of Mathematical Biology, Vol. 15, pp. 267–273, 1982.
[8] F. Peper and H. Noda, A Symmetric Linear Neural Network That Learns Principal Components and Their Variances, IEEE Transactions on Neural Networks, Vol. 7, No. 4, pp. 1042–1047, 1996.
[9] M. D. Plumbley, Lyapunov Functions for Convergence of Principal Component Algorithms, Neural Networks, Vol. 8, No. 1, pp. 11–23, 1995.
[10] H. Robbins and S. Monro, A Stochastic Approximation Method, Annals of Mathematical Statistics, Vol. 22, pp. 400–407, 1951.
[11] T. D. Sanger, Optimal unsupervised learning in a single-layer linear feedforward neural network, Neural Networks, Vol. 2, pp. 459–473, 1989.
[12] J. Testa and G. A. Held, Study of a one-dimensional map with multiple basins, Phys. Rev. A, Vol. 28, No. 3, pp. 3085–3089, 1983.
[13] W.-Y. Yan, U. Helmke, and J. B. Moore, Global Analysis of Oja’s Flow for Neural Networks, IEEE Transactions on Neural Networks, Vol. 5, No. 5, pp. 674–683, 1994.
[14] Q. Zhang and Z. Bao, Dynamical System for Computing the Eigenvectors Associated with the Largest Eigenvalue of a Positive Definite Matrix, IEEE Transactions on Neural Networks, Vol. 6, No. 3, pp. 790–791, 1995.
[15] Q. Zhang and Y. W. Leung, Energy Function for One-Unit Oja Algorithm, IEEE Transactions on Neural Networks, Vol. 6, No. 5, pp. 1291–1293, 1995.
Copyright information
© 1999 Springer-Verlag Wien
About this paper
Cite this paper
Zufiria, P.J. (1999). Influence of the Learning Gain on the Dynamics of Oja’s Neural Network. In: Artificial Neural Nets and Genetic Algorithms. Springer, Vienna. https://doi.org/10.1007/978-3-7091-6384-9_18
DOI: https://doi.org/10.1007/978-3-7091-6384-9_18
Publisher Name: Springer, Vienna
Print ISBN: 978-3-211-83364-3
Online ISBN: 978-3-7091-6384-9