The Object Perceptron Learning Algorithm on Generalised Hopfield Networks for Associative Memory

Neural Computing & Applications

Abstract

We present a study of generalised Hopfield networks for associative memory. By analysing the radius of attraction of a stable state, we propose the Object Perceptron Learning Algorithm (OPLA) and an OPLA scheme for storing a set of sample patterns (vectors) in a generalised Hopfield network with radii of attraction as large as required. OPLA modifies a set of weights and a threshold in a way similar to the perceptron learning algorithm. Simulation results show that the OPLA scheme is more effective for associative memory than both the sum-of-outer-product scheme with a Hopfield network and the weighted sum-of-outer-product scheme with an asymmetric Hopfield network.

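The abstract describes OPLA only at a high level: weights and per-unit thresholds are adjusted perceptron-style until the sample patterns are stored as stable states. The listing below is a minimal illustrative sketch of that idea, assuming bipolar (±1) patterns and an asymmetric weight matrix; the function name perceptron_store and the parameters eta, margin and max_epochs are illustrative and not taken from the paper, and the radius-of-attraction constraint that distinguishes OPLA is not reproduced here.

    import numpy as np

    def perceptron_store(patterns, eta=0.1, margin=1.0, max_epochs=100):
        """Sketch of a perceptron-style storage rule for a Hopfield-type network.

        patterns: array of shape (P, N) with entries in {-1, +1}.
        Returns an (asymmetric) weight matrix W and a threshold vector theta.
        """
        patterns = np.asarray(patterns, dtype=float)
        P, N = patterns.shape
        W = np.zeros((N, N))      # asymmetric weights are allowed
        theta = np.zeros(N)       # one threshold per unit

        for _ in range(max_epochs):
            stable = True
            for x in patterns:
                h = W @ x - theta              # local fields
                wrong = x * h < margin         # units lacking the required margin
                if wrong.any():
                    stable = False
                    # perceptron-style correction of the offending rows and thresholds
                    W[wrong] += eta * np.outer(x[wrong], x)
                    theta[wrong] -= eta * x[wrong]
            np.fill_diagonal(W, 0.0)           # no self-connections
            if stable:                         # every pattern is a stable state
                break
        return W, theta

Recall would then proceed by the usual asynchronous updates x_i <- sgn(sum_j W_ij x_j - theta_i) until a fixed point is reached; enlarging the margin is one plausible way to mimic a larger radius of attraction, though the paper's actual mechanism may differ.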


About this article

Cite this article

Ma, J. The Object Perceptron Learning Algorithm on Generalised Hopfield Networks for Associative Memory. Neural Computing & Applications 8, 25–32 (1999). https://doi.org/10.1007/s005210050004
