Abstract
In this paper, we introduce the concepts of enhancement and relaxation to discover features in input patterns in competitive learning. We have previously introduced mutual information to realize competitive processes. Because mutual information is an average over all input patterns and competitive units, it cannot capture detailed features. To examine in more detail how a network is organized, we enhance and relax competitive units through some elements in the network. With this procedure, we can estimate in greater detail how those elements are organized. We applied the method to a simple artificial data set and the well-known Iris problem to show how well it extracts the main features in input patterns. Experimental results showed that the method extracted the main features in input patterns more explicitly than the conventional self-organizing map (SOM).
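The mutual information the abstract refers to, between input patterns and competitive units, can be sketched as follows. This is a generic formulation under our own assumptions (equiprobable patterns, firing probabilities given as an array), not the paper's exact procedure:

```python
import numpy as np

def mutual_information(p_j_given_s):
    """Mutual information between input patterns and competitive units.

    p_j_given_s : array of shape (S, J); row s holds the firing
    probabilities p(j|s) of the J competitive units for pattern s,
    so each row sums to 1. Patterns are assumed equiprobable,
    p(s) = 1/S (an assumption, not stated in the paper).
    """
    S, J = p_j_given_s.shape
    p_s = 1.0 / S
    # Marginal firing probability of each unit: p(j) = sum_s p(s) p(j|s)
    p_j = p_j_given_s.mean(axis=0)
    eps = 1e-12  # guard against log(0) when a unit never fires for a pattern
    return float(np.sum(p_s * p_j_given_s *
                        np.log((p_j_given_s + eps) / (p_j + eps))))
```

The quantity is zero when every unit responds uniformly to every pattern and reaches its maximum, log J, when each pattern is captured by a distinct, deterministic winner; this is why, as the abstract notes, it summarizes the competitive process only on average and motivates a finer-grained analysis of individual elements.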
Copyright information
© 2008 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Kamimura, R. (2008). Feature Discovery by Enhancement and Relaxation of Competitive Units. In: Fyfe, C., Kim, D., Lee, SY., Yin, H. (eds) Intelligent Data Engineering and Automated Learning – IDEAL 2008. IDEAL 2008. Lecture Notes in Computer Science, vol 5326. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-88906-9_19
DOI: https://doi.org/10.1007/978-3-540-88906-9_19
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-88905-2
Online ISBN: 978-3-540-88906-9
eBook Packages: Computer Science (R0)