Abstract
In this paper, we propose structural enhanced information for detecting the main features in input patterns. Structural enhanced information comprises three types of enhanced information: first-, second-, and third-order. The first-order information concerns the enhancement of competitive units themselves through some elements in a network, while the second-order information depends on the enhancement of competitive units with input patterns. The third-order information is obtained by subtracting the effect of the first-order information from the second-order information, and thus represents information on input patterns more explicitly. With this structural enhanced information, we can estimate more detailed features in input patterns. We applied the method to the well-known Iris problem and succeeded in extracting detailed and important features, especially by using the third-order information.
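The subtraction scheme described above can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual formulation: it assumes Gaussian competitive activations normalized over units, and it approximates the "enhancement of competitive units themselves" as the units' response to their own weight centroid — the array sizes, `sigma`, and the centroid reference are all hypothetical choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(5, 4))   # 5 competitive units with 4-dim weights (hypothetical sizes)
x = rng.normal(size=4)        # one input pattern


def normalized_activations(W, ref, sigma=1.0):
    """Gaussian competitive activations of all units to a reference
    vector, normalized so they sum to one over the units."""
    d = np.sum((W - ref) ** 2, axis=1)
    p = np.exp(-d / (2.0 * sigma ** 2))
    return p / p.sum()


# First-order: enhancement of the units themselves (here: response to
# their own weight centroid -- an assumption, not the paper's element).
first_order = normalized_activations(W, W.mean(axis=0))

# Second-order: enhancement of the units with an input pattern.
second_order = normalized_activations(W, x)

# Third-order: subtract the first-order effect, leaving the part of the
# response that is specific to the input pattern.
third_order = second_order - first_order
```

Because both orders are normalized distributions over the units, the third-order values sum to zero: positive entries mark units enhanced specifically by the input, negative entries mark units whose activation is explained by the network itself.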
Copyright information
© 2009 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Kamimura, R. (2009). Feature Detection by Structural Enhanced Information. In: Köppen, M., Kasabov, N., Coghill, G. (eds) Advances in Neuro-Information Processing. ICONIP 2008. Lecture Notes in Computer Science, vol 5507. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-03040-6_21
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-03039-0
Online ISBN: 978-3-642-03040-6
eBook Packages: Computer Science (R0)