Abstract
In this paper, we present an entropy minimization method for competitive learning with a winner-take-all activation rule. In competitive learning, only one unit is turned on as the winner, while all the other units are turned off as losers. Learning can therefore be viewed as a process of entropy minimization: if the entropy in the competitive layer is minimized, only one unit is on while all the other units are off; if the entropy is maximized, all the units are equally activated.
We applied this entropy minimization method to two problems: an autoencoder used as a feature detector, and the organization of internal representations for estimating the well-formedness of English sentences. For the autoencoder, networks trained with the entropy method clearly classified four input patterns into two categories. For the sentence well-formedness problem, a feature of the input patterns was explicitly visible in the competitive hidden layer; in other words, an explicit internal representation was obtained. In both cases, multiple inhibitory connections were observed to emerge. Thus, the entropy minimization method is equivalent to competitive learning through mutual inhibition, while being simpler and easier to compute. In the formulation and experiments, supervised learning (the autoencoder) was used; however, the entropy method can be extended to fully unsupervised learning, which may replace ordinary competitive learning with a winner-take-all activation rule.
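The entropy quantity underlying the abstract can be sketched numerically: normalize the competitive-layer activations into a distribution and compute its Shannon entropy. This is a minimal illustration, not the paper's implementation; the function name and the example activation vectors are hypothetical.

```python
import numpy as np

def competitive_entropy(v, eps=1e-12):
    """Entropy of normalized competitive-layer activations.

    Hypothetical sketch: v holds nonnegative unit activations,
    p_i = v_i / sum_j v_j, and H = -sum_i p_i * log(p_i).
    Minimizing H drives one unit toward 1 (the winner) and the
    rest toward 0 (losers); maximizing H equalizes all units.
    """
    p = v / v.sum()
    return float(-(p * np.log(p + eps)).sum())

uniform = np.array([0.25, 0.25, 0.25, 0.25])  # all units equally active
peaked  = np.array([0.97, 0.01, 0.01, 0.01])  # near winner-take-all

print(competitive_entropy(uniform))  # maximum entropy, log 4 ≈ 1.386
print(competitive_entropy(peaked))   # low entropy, ≈ 0.168
```

Adding this entropy to a network's cost function (with a positive weight) penalizes evenly spread activations, so gradient descent on the combined cost pushes the competitive layer toward the one-winner configuration the abstract describes.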
© 1993 Springer-Verlag Berlin Heidelberg
Kamimura, R. (1993). Competitive learning by entropy minimization. In: Doshita, S., Furukawa, K., Jantke, K.P., Nishida, T. (eds) Algorithmic Learning Theory. ALT 1992. Lecture Notes in Computer Science, vol 743. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-57369-0_32
Print ISBN: 978-3-540-57369-2
Online ISBN: 978-3-540-48093-8