Abstract
In this paper, a local distribution neural network is proposed for data clustering. This competitive network is designed for non-stationary and evolving environments. It represents data by means of neurons (ellipsoids) arranged on a topology map. The local distribution is stored in the ellipsoids, while global topology information is preserved in the relationships between adjacent ellipsoids. With a self-adapting threshold strategy and iterative learning of the local distribution information, the algorithm operates in an incremental, online manner. The adopted metric is an improved Mahalanobis distance that takes the local distribution into account and captures anisotropy along different basis vectors. Hence the method can be interpreted as an incremental version of a Gaussian mixture model. Experiments on both artificial and real-world data demonstrate the performance of the proposed method.
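To illustrate the kind of metric the abstract refers to: the squared Mahalanobis distance weights each direction by the local covariance, so a point offset along an ellipsoid's long axis scores closer than the same Euclidean offset along its short axis. The sketch below is a minimal, self-contained illustration of that anisotropy, not the authors' exact formulation; the function names and the 2x2 example covariance are assumptions for demonstration only.

```python
def invert_2x2(m):
    """Invert a 2x2 matrix given as nested lists (illustrative helper)."""
    det = m[0][0] * m[1][1] - m[0][1] * m[1][0]
    return [[ m[1][1] / det, -m[0][1] / det],
            [-m[1][0] / det,  m[0][0] / det]]

def mahalanobis_sq(x, mean, cov_inv):
    """Squared Mahalanobis distance (x - mean)^T Sigma^{-1} (x - mean)."""
    d = [xi - mi for xi, mi in zip(x, mean)]
    y = [sum(cov_inv[i][j] * d[j] for j in range(len(d)))
         for i in range(len(d))]
    return sum(di * yi for di, yi in zip(d, y))

# An ellipsoid elongated along the x-axis: variance 4 in x, 1 in y.
cov_inv = invert_2x2([[4.0, 0.0], [0.0, 1.0]])
along = mahalanobis_sq([2.0, 0.0], [0.0, 0.0], cov_inv)   # along long axis
across = mahalanobis_sq([0.0, 2.0], [0.0, 0.0], cov_inv)  # along short axis
```

Both test points lie at Euclidean distance 2 from the mean, but `along` evaluates to 1.0 while `across` evaluates to 4.0: the metric is anisotropic with respect to the local distribution, which is the property exploited when assigning samples to ellipsoids.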
Copyright information
© 2012 Springer-Verlag Berlin Heidelberg
Cite this paper
Ouyang, Q., Shen, F., Zhao, J. (2012). A Local Distribution Net for Data Clustering. In: Anthony, P., Ishizuka, M., Lukose, D. (eds) PRICAI 2012: Trends in Artificial Intelligence. Lecture Notes in Computer Science, vol. 7458. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-32695-0_37
Print ISBN: 978-3-642-32694-3
Online ISBN: 978-3-642-32695-0