Abstract
Recurrent Backpropagation schemes for fixed point learning in continuous-time dynamic neural networks can be formalized through a differential-algebraic model, which in turn leads to singularly perturbed training techniques. Such models clarify the relative time-scaling between the network evolution and the adaptation dynamics, and allow for rigorous local convergence proofs. The present contribution addresses some related issues in a discrete-time context: fixed point problems can be analyzed in terms of iterations with different evolution rates, whereas periodic trajectory learning can be reduced to a multiple fixed point learning problem via Poincaré maps.
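To make the two-time-scale idea concrete, here is a minimal sketch of recurrent backpropagation for fixed point learning, in the discrete-time spirit of the paper. It assumes a standard additive network dx/dt = -x + tanh(Wx + b) and a single target fixed point; the network size, step sizes, and target below are illustrative choices, not taken from the paper. The fast iterations relax the state and the adjoint (error-propagation) system to their fixed points, while the slow iteration adapts the weights.

import numpy as np

rng = np.random.default_rng(0)
n = 8                                    # number of units (illustrative)
W = 0.1 * rng.standard_normal((n, n))    # small weights keep the fast dynamics contractive
b = np.zeros(n)
target = rng.uniform(-0.5, 0.5, n)       # hypothetical target fixed point

eta = 0.05                               # slow (adaptation) rate
dt = 0.1                                 # fast (relaxation) step

def relax(f, x, steps=400):
    # Fast time scale: Euler-iterate dx/dt = f(x) toward a fixed point.
    for _ in range(steps):
        x = x + dt * f(x)
    return x

for epoch in range(200):
    # Fast phase 1: relax the network state to its fixed point x*.
    x = relax(lambda v: -v + np.tanh(W @ v + b), np.zeros(n))
    d = 1.0 - np.tanh(W @ x + b) ** 2    # sigma'(u) at the fixed point
    g = x - target                       # output error at the fixed point
    # Fast phase 2: relax the adjoint system dz/dt = -z + W^T(d*z) + g,
    # whose fixed point z satisfies (I - W^T D) z = g.
    z = relax(lambda v: -v + W.T @ (d * v) + g, np.zeros(n))
    # Slow phase: one gradient step on the weights, since dE/dW = (d*z) x*^T.
    W -= eta * np.outer(d * z, x)

print("final error:", 0.5 * np.sum((x - target) ** 2))

In the singularly perturbed reading of the abstract, the fast phases would not be run to convergence between weight updates; both dynamics evolve concurrently, with the state equation scaled by a small parameter, roughly ε ẋ = -x + tanh(Wx + b), Ẇ = -η ∇_W E, and the local convergence analysis rests on this explicit time-scale separation. The Poincaré-map reduction mentioned for periodic trajectories replaces the single fixed point above by a multiple fixed point problem for the return map.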
Copyright information
© 2002 Springer-Verlag Berlin Heidelberg
Cite this paper
Riaza, R., Zufiria, P.J. (2002). Time-Scaling in Recurrent Neural Learning. In: Dorronsoro, J.R. (eds) Artificial Neural Networks — ICANN 2002. ICANN 2002. Lecture Notes in Computer Science, vol 2415. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-46084-5_221
Print ISBN: 978-3-540-44074-1
Online ISBN: 978-3-540-46084-8