Abstract
Initialization of self-organizing maps is typically based on random vectors drawn from the given input space. An implicit problem with random initialization is the overlap (entanglement) of connections between neurons. In this paper, we present a new initialization method based on a family of self-similar, space-filling curves known as Hilbert curves. Hilbert curves can be scaled to the size of the network, i.e., to the number of neurons, by a simple recursive (fractal) construction implicit in their properties. We show that with Hilbert curve vector (HCV) initialization, both the classical SOM algorithm and a parallel growing algorithm (ParaSOM) reach better coverage and faster organization.
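To make the construction concrete, the sketch below illustrates how neuron weight vectors could be laid out along a Hilbert curve in the unit square instead of being drawn at random. This is a minimal illustration of the general idea, not the authors' exact HCV procedure: the index-to-coordinate conversion is the standard Hilbert curve mapping, and the function names hilbert_d2xy and hcv_init are hypothetical.

```python
def hilbert_d2xy(order, d):
    """Map a 1-D index d to an (x, y) cell on a Hilbert curve covering a
    2**order x 2**order grid (standard index-to-coordinate conversion)."""
    x = y = 0
    t = d
    s = 1
    n = 1 << order
    while s < n:
        rx = 1 & (t // 2)
        ry = 1 & (t ^ rx)
        if ry == 0:
            if rx == 1:           # reflect the quadrant
                x = s - 1 - x
                y = s - 1 - y
            x, y = y, x           # rotate the quadrant
        x += s * rx
        y += s * ry
        t //= 4
        s *= 2
    return x, y


def hcv_init(num_neurons, order=4):
    """Hypothetical sketch: spread num_neurons initial weight vectors evenly
    along a Hilbert curve of the given order, scaled to the unit square."""
    n = 1 << order                # the curve visits n*n cells
    total = n * n
    weights = []
    for i in range(num_neurons):
        # evenly spaced positions along the 1-D curve index
        d = i * (total - 1) // max(num_neurons - 1, 1)
        x, y = hilbert_d2xy(order, d)
        # use cell centers, normalized to [0, 1]^2
        weights.append(((x + 0.5) / n, (y + 0.5) / n))
    return weights


if __name__ == "__main__":
    w = hcv_init(16, order=3)
    print(w[:4])   # first few initial weight vectors in [0, 1]^2
```

In a SOM, such positions would serve as the initial codebook vectors in place of the usual random draw; because consecutive curve indices map to adjacent cells, neighboring neurons start out untangled, and higher curve orders give a finer lattice for larger networks.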





















