
Closed Loop Stability of FIR-Recurrent Neural Networks

  • Conference paper
Artificial Neural Networks and Neural Information Processing — ICANN/ICONIP 2003 (ICANN 2003, ICONIP 2003)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 2714)


Abstract

In this paper, the stability of a general class of discrete-time delayed recurrent neural networks is re-investigated in light of some recent results. These networks are obtained by modeling synapses as Finite Impulse Response (FIR) filters instead of multiplicative scalars. We first derive sufficient conditions for the network operating in closed loop to converge to a fixed point, using a Lyapunov functional method; the symmetry of the connection matrix is not assumed. We then show how these conditions relate to other conditions ensuring both the existence of the error gradient over arbitrarily long trajectories and the asymptotic stability of the fixed points at each time step.
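
For concreteness, here is a minimal sketch of the class of dynamics the title refers to, under the usual FIR-synapse convention (the paper's exact notation may differ). Each connection is an order-M FIR filter rather than a scalar weight, so the closed-loop state update can be written as

x(t+1) = f\Big( \sum_{m=0}^{M} W_m \, x(t-m) + b \Big),

where W_m is the weight matrix of the m-th delay tap, b is a bias vector, and f is a componentwise sigmoidal activation. In this setting, contraction-type sufficient conditions for convergence to a unique fixed point typically take the form

L_f \sum_{m=0}^{M} \| W_m \| < 1,

with L_f the Lipschitz constant of f and \|\cdot\| an induced matrix norm; the paper's precise (and likely sharper) conditions, which do not require the W_m to be symmetric, are given in the full text.

The short Python sketch below illustrates such a closed-loop FIR network converging to a fixed point when a contraction-type bound of this kind is enforced; all sizes and the rescaling constant are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, M = 4, 3  # number of neurons, FIR memory depth (illustrative)

# One weight matrix per delay tap. Rescale so that
# sum_m ||W_m||_2 < 1; tanh is 1-Lipschitz, so this enforces a
# contraction-type sufficient condition for a unique fixed point.
W = rng.standard_normal((M + 1, n, n))
W *= 0.9 / sum(np.linalg.norm(W[m], 2) for m in range(M + 1))
b = 0.1 * rng.standard_normal(n)

def step(history):
    """One closed-loop update. Each synapse is an FIR filter over the
    last M+1 network states; history[0] is the most recent state."""
    s = b.copy()
    for m in range(M + 1):
        s += W[m] @ history[m]
    return np.tanh(s)

# Iterate from a random initial history; under the bound above the
# trajectory contracts to the unique fixed point.
hist = [rng.standard_normal(n) for _ in range(M + 1)]
for _ in range(200):
    hist = [step(hist)] + hist[:-1]

print("state after 200 steps:", np.round(hist[0], 6))
print("one-step residual:", np.linalg.norm(step(hist) - hist[0]))
```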






Copyright information

© 2003 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Aussem, A. (2003). Closed Loop Stability of FIR-Recurrent Neural Networks. In: Kaynak, O., Alpaydin, E., Oja, E., Xu, L. (eds) Artificial Neural Networks and Neural Information Processing — ICANN/ICONIP 2003. Lecture Notes in Computer Science, vol 2714. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-44989-2_62


  • DOI: https://doi.org/10.1007/3-540-44989-2_62

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-40408-8

  • Online ISBN: 978-3-540-44989-8

  • eBook Packages: Springer Book Archive
