Abstract
This paper proposes an enhancement of the classical Hopfield network algorithm (and, potentially, its stochastic derivatives) with an "adaptation mechanism" that guides the neural search process towards high-quality solutions of large-scale static optimization problems. Specifically, a novel methodology is formulated that employs gradient descent in the error space to adapt the weights and constraint weight parameters, thereby steering the network dynamics towards solutions. In doing so, an algebraic approach is adopted to define error values for each neuron without knowledge of the desired output values for those neurons.
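The core idea of such an adaptation mechanism can be sketched in code. The following is a minimal illustration on a toy "exactly one of n neurons active" constraint, not the paper's formulation: the penalty-energy network, the error definition from the constraint violation alone (no desired outputs), and the finite-difference surrogate for the gradient with respect to the constraint weight are all simplifying assumptions made here for the sketch.

```python
import numpy as np

# Toy constraint-satisfaction problem: exactly one of n neurons should be
# active.  Penalty energy E(v) = (A/2) * (sum(v) - 1)^2, where A is the
# constraint weight parameter that the outer loop adapts.
n, tau = 4, 0.5
u0 = np.random.default_rng(0).normal(0.0, 0.1, n)  # fixed initial states

def relax(A, steps=200, dt=0.1):
    """Continuous Hopfield-style relaxation under penalty weight A."""
    u = u0.copy()
    for _ in range(steps):
        v = 1.0 / (1.0 + np.exp(-u / tau))     # sigmoid outputs
        u += dt * (-u - A * (v.sum() - 1.0))   # leak plus penalty gradient
    return 1.0 / (1.0 + np.exp(-u / tau))

def violation(A):
    """Algebraic error for the converged state: it needs no desired output
    values, only the degree of constraint violation."""
    return (relax(A).sum() - 1.0) ** 2

# Adapt A by gradient descent on the error; a central finite difference
# stands in for an analytic gradient in this sketch.
A, eta, h = 0.5, 5.0, 1e-3
for _ in range(30):
    g = (violation(A + h) - violation(A - h)) / (2.0 * h)
    A -= eta * g
```

In this toy setting the adaptation loop grows the constraint weight until the relaxed state satisfies the one-of-n constraint more tightly; the paper's contribution lies in deriving such error values and gradient updates for the general Hopfield formulation rather than this hand-built surrogate.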
Copyright information
© 2003 Springer-Verlag Berlin Heidelberg
Cite this paper
Serpen, G. (2003). Adaptive Hopfield Network. In: Kaynak, O., Alpaydin, E., Oja, E., Xu, L. (eds) Artificial Neural Networks and Neural Information Processing — ICANN/ICONIP 2003. ICANN ICONIP 2003 2003. Lecture Notes in Computer Science, vol 2714. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-44989-2_1
Print ISBN: 978-3-540-40408-8
Online ISBN: 978-3-540-44989-8