
On the stability of neural networks with arbitrary weights

Published in: Neural Computing & Applications

Abstract

Dynamical properties of a general neural network are discussed. A condition under which the neural activation dynamics are stable is derived. The stability condition places no symmetry restriction on the weights. The transitions from stable to unstable dynamics are analysed, and their analogy to phase transitions in statistical mechanics is discussed. In general, the network dynamics can violate the stability condition. To prevent this, a method is introduced that makes the dynamics adaptive, so that the stability condition is sustained. Relations to some previous works on stability are discussed.
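The ideas in the abstract can be illustrated numerically. The sketch below is not the paper's derivation: it assumes continuous dynamics of the common form dx/dt = −x + W g(x) with g = tanh, and uses the standard linearisation bound at the origin (the fixed point is stable when gain · max Re λ(W) < 1, which holds for arbitrary, asymmetric W) as a stand-in for the paper's stability condition. The `adapt_gain` routine is likewise only one illustrative way to "make the dynamics adaptive" so that the condition is sustained.

```python
import numpy as np

def spectral_abscissa(W):
    """Largest real part of the eigenvalues of W (no symmetry assumed)."""
    return np.max(np.real(np.linalg.eigvals(W)))

def is_stable(W, gain=1.0):
    """Sufficient stability check for dx/dt = -x + W g(x) with |g'| <= gain.

    Linearising at the origin gives the Jacobian -I + gain * W, so the
    origin is stable if gain * Re(lambda) < 1 for every eigenvalue of W.
    (A standard bound, not the paper's exact condition.)
    """
    return gain * spectral_abscissa(W) < 1.0

def adapt_gain(W, gain=1.0, shrink=0.9):
    """Illustrative adaptive step: shrink the activation gain until the
    stability condition holds, 'sustaining' stability for arbitrary W."""
    while not is_stable(W, gain):
        gain *= shrink
    return gain

rng = np.random.default_rng(0)
W = rng.normal(size=(5, 5))      # arbitrary, asymmetric weight matrix
g = adapt_gain(W)                # reduce gain until the condition holds
assert is_stable(W, g)
```

Because only the eigenvalues of W enter, the check works identically for symmetric and asymmetric weight matrices; reducing the gain always succeeds eventually, since the left-hand side of the condition shrinks toward zero.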




Cite this article

Olafsson, S. On the stability of neural networks with arbitrary weights. Neural Comput & Applic 4, 2–9 (1996). https://doi.org/10.1007/BF01413864

