
Clustering properties of hierarchical self-organizing maps

Published in: Journal of Mathematical Imaging and Vision

Abstract

A multilayer hierarchical self-organizing map (HSOM) is discussed as an unsupervised clustering method. The HSOM is shown to form arbitrarily complex clusters, in analogy with multilayer feedforward networks. In addition, the HSOM provides a natural measure for the distance of a point from a cluster, one that weights all the points belonging to the cluster appropriately. Experiments with both artificial and real data demonstrate that the multilayer SOM forms clusters that match the desired classes better than those produced by direct SOMs, classical k-means, or the Isodata algorithm.
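To make the general idea concrete, the following is a minimal sketch, not the authors' implementation: two one-dimensional SOMs are stacked so that the first layer quantizes the raw data and a second, smaller SOM clusters the first layer's codebook, pooling many first-layer prototypes into a few top-level clusters. All names and parameters here (train_som, hsom_cluster, the layer sizes, and the decay schedules) are illustrative assumptions, and the paper's specific point-to-cluster distance measure is not reproduced.

```python
# A minimal two-layer hierarchical SOM sketch in plain NumPy (assumed setup,
# not the paper's exact formulation): layer 1 quantizes the input space,
# layer 2 clusters the layer-1 prototypes, so each top-level unit acts as a
# cluster label that pools several first-layer units.
import numpy as np

def train_som(data, n_units, n_iter=2000, lr0=0.5, sigma0=None, rng=None):
    """Train a 1-D SOM with sequential updates; returns the (n_units, dim) codebook."""
    rng = np.random.default_rng() if rng is None else rng
    sigma0 = sigma0 or n_units / 2.0
    w = data[rng.choice(len(data), n_units)].astype(float)   # random init from data
    grid = np.arange(n_units)
    for t in range(n_iter):
        frac = t / n_iter
        lr = lr0 * (1.0 - frac)                    # linearly decaying learning rate
        sigma = sigma0 * (1.0 - frac) + 0.5        # shrinking neighborhood width
        x = data[rng.integers(len(data))]          # one random training sample
        bmu = np.argmin(np.linalg.norm(w - x, axis=1))        # best-matching unit
        h = np.exp(-((grid - bmu) ** 2) / (2 * sigma ** 2))   # neighborhood kernel
        w += lr * h[:, None] * (x - w)
    return w

def hsom_cluster(data, n_units1=32, n_units2=3, rng=None):
    """Two-layer HSOM: map each point through layer 1, then label the
    layer-1 prototypes with a smaller layer-2 SOM."""
    w1 = train_som(data, n_units1, rng=rng)
    # Layer 2 is trained on the layer-1 codebook vectors themselves, so each
    # layer-2 unit groups several layer-1 units into one cluster.
    w2 = train_som(w1, n_units2, rng=rng)
    bmu1 = np.argmin(np.linalg.norm(data[:, None, :] - w1[None], axis=2), axis=1)
    label_of_unit1 = np.argmin(np.linalg.norm(w1[:, None, :] - w2[None], axis=2), axis=1)
    return label_of_unit1[bmu1]                    # cluster label per data point

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Three well-separated Gaussian blobs as toy "artificial data".
    blobs = np.vstack([rng.normal(c, 0.3, size=(100, 2))
                       for c in ((0, 0), (3, 0), (0, 3))])
    labels = hsom_cluster(blobs, rng=rng)
    print("cluster sizes:", np.bincount(labels))
```

On well-separated blobs this sketch should recover three clusters of roughly equal size; in the spirit of the paper, the top-level units supply the cluster labels while the first layer absorbs within-cluster variation.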

Cite this article

Lampinen, J., Oja, E. Clustering properties of hierarchical self-organizing maps. J Math Imaging Vis 2, 261–272 (1992). https://doi.org/10.1007/BF00118594
