
Weight Initialization for Simultaneous Recurrent Neural Network Trained with a Fixed-point Learning Algorithm

Published in: Neural Processing Letters

Abstract

This letter presents a study of the Simultaneous Recurrent Neural Network, an adaptive algorithm, as a nonlinear dynamic system for static optimization. Empirical findings recently reported in the literature suggest that the Simultaneous Recurrent Neural Network offers superior performance on large-scale instances of combinatorial optimization problems in terms of desirable convergence characteristics, improved solution quality, and computational complexity measures. A theoretical study is carried out that explores the initialization properties of the Simultaneous Recurrent Neural Network dynamics in order to facilitate application of a fixed-point training algorithm. Specifically, initialization of the weight matrix entries so as to induce one or more stable equilibrium points in the state space of the nonlinear network dynamics is investigated, and applicable theoretical bounds are derived. A simulation study confirming the theoretical bounds on the initial weight values is presented. The theoretical findings and the correlating simulation study suggest that the Simultaneous Recurrent Neural Network dynamics possesses desirable stability characteristics as an adaptive recurrent neural network for addressing static optimization problems.
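The idea of initializing weights so that the relaxation dynamics settle to a stable fixed point can be sketched as follows. This is a minimal illustrative example, not the letter's derivation: the contraction condition used here (spectral norm of the weight matrix below 4, exploiting the logistic sigmoid's maximum slope of 1/4) is a standard sufficient condition for a unique stable equilibrium, and all function and variable names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8  # number of network units

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

# Initialize weights with small magnitudes. Since the logistic sigmoid has
# maximum slope 1/4, any W with spectral norm below 4 makes the relaxation
# map a contraction, which guarantees a unique stable fixed point
# (illustrative sufficient condition, not the bound derived in the letter).
W = rng.uniform(-0.1, 0.1, size=(n, n))
assert np.linalg.norm(W, 2) < 4.0

def relax(W, x, tol=1e-10, max_iter=10000):
    """Iterate z <- sigmoid(W z + x) until the state stops changing."""
    z = np.zeros(len(x))
    for _ in range(max_iter):
        z_next = sigmoid(W @ z + x)
        if np.max(np.abs(z_next - z)) < tol:
            return z_next
        z = z_next
    return z

x = rng.uniform(-1.0, 1.0, size=n)  # static external input
z_star = relax(W, x)
# z_star approximately satisfies the fixed-point equation z* = sigmoid(W z* + x)
residual = np.max(np.abs(z_star - sigmoid(W @ z_star + x)))
```

With larger weight magnitudes the contraction argument no longer applies and the relaxation may oscillate or converge to one of several equilibria, which is why bounds on the initial weights matter for fixed-point training.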





Cite this article

Serpen, G., Xu, Y. Weight Initialization for Simultaneous Recurrent Neural Network Trained with a Fixed-point Learning Algorithm. Neural Processing Letters 17, 33–41 (2003). https://doi.org/10.1023/A:1022921127061
