A higher order Hopfield network for vector quantisation

Neural Computing & Applications

Abstract

A higher order version of the Hopfield neural network is presented that performs a simple vector quantisation or clustering function. Unlike the usual formulation, in which the energy involves only terms quadratic in the state vector, this model requires no penalty terms to impose constraints in the Hopfield energy. The energy function is shown to have no local minima within the unit hypercube of the state vector, so the network converges only to valid final states. Optimisation trials show that the network consistently finds optimal clusterings for small trial problems and near-optimal ones for a large data set consisting of the intensity values of a digitised grey-level image.
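
The abstract describes the task only at a high level and does not reproduce the network's energy function, so the sketch below is purely illustrative: it quantises scalar grey-level intensities with a soft-assignment relaxation (a deterministic-annealing, mean-field style update, not the paper's higher-order Hopfield dynamics). It is meant to show the two properties the abstract emphasises: the relaxed assignment matrix stays inside the unit hypercube without penalty terms, and it hardens to a valid clustering. The data, the temperature schedule and all names (soft_assign, P, K, T) are assumptions made for this example.

```python
# Illustrative sketch only -- NOT the paper's higher-order Hopfield energy or
# update rule.  A soft assignment matrix V (one row per data point, one column
# per cluster) is relaxed towards a hard clustering of scalar grey-level
# intensities; each row stays on the unit simplex by construction, so no
# penalty terms are needed to enforce the assignment constraint.
import numpy as np

rng = np.random.default_rng(0)

# Toy "grey-level" data: P intensities drawn from three bands, K clusters sought.
P, K = 60, 3
x = np.concatenate([rng.normal(m, 5.0, P // K) for m in (40.0, 120.0, 200.0)])

def soft_assign(x, centroids, T):
    """Row-wise softmax of negative squared distances at temperature T."""
    d2 = (x[:, None] - centroids[None, :]) ** 2
    logits = -d2 / T
    logits -= logits.max(axis=1, keepdims=True)    # numerical stability
    e = np.exp(logits)
    return e / e.sum(axis=1, keepdims=True)

# Deterministic-annealing style iteration: alternate soft assignment and
# centroid update while lowering the temperature, so V hardens towards a
# valid (one cluster per point) final state.
centroids = np.quantile(x, (np.arange(K) + 0.5) / K)  # spread-out initial centroids
for T in np.geomspace(1000.0, 1.0, 40):
    for _ in range(10):
        V = soft_assign(x, centroids, T)
        centroids = (V * x[:, None]).sum(0) / np.maximum(V.sum(0), 1e-12)

labels = V.argmax(axis=1)
print("centroids:", np.round(np.sort(centroids), 1))
print("cluster sizes:", np.bincount(labels, minlength=K))
```

In this sketch the row-wise normalisation keeps the constraint satisfied by construction; the paper achieves the same end through the form of its higher-order energy, which is not reproduced here.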

Cite this article

Soper, A. A higher order Hopfield network for vector quantisation. Neural Comput & Applic 7, 99–106 (1998). https://doi.org/10.1007/BF01414161
