Abstract
Multilayer perceptrons can be trained with several algorithms and guided by different quantities that relate the expected output to the state achieved by the network. The mean square error is the most common of these quantities, but information-theoretic measures have also been applied with great success. Evolutionary computation offers a common alternative training scheme to the widely used backpropagation algorithm. In this contribution we evaluate the performance of multilayer perceptrons as classifiers when trained with genetic algorithms that maximize the mutual information between the label produced by the network and the expected class. We propose a classification algorithm in which each input variable is replaced by a function of it chosen so that the mutual information between the new function and the label is maximized. These transformed variables are then fed to a multilayer perceptron that learns the classification map, trained with genetic algorithms and guided by mutual information.
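The central quantity in the abstract, the mutual information between the network's output label and the expected class, can be estimated from a contingency table over discrete labels. The following is a minimal sketch of such an estimator (function name and the idea of using it as a GA fitness score are illustrative assumptions, not the authors' implementation):

```python
import numpy as np

def mutual_information(labels_true, labels_pred):
    """Discrete mutual information I(Y; Y_hat), in bits.

    Sketch of a fitness quantity: a genetic algorithm would favor
    weight vectors whose output labels share more information with
    the expected classes.
    """
    labels_true = np.asarray(labels_true)
    labels_pred = np.asarray(labels_pred)
    mi = 0.0
    for a in np.unique(labels_true):
        p_a = np.mean(labels_true == a)          # marginal P(Y = a)
        for b in np.unique(labels_pred):
            p_b = np.mean(labels_pred == b)      # marginal P(Y_hat = b)
            p_ab = np.mean((labels_true == a) & (labels_pred == b))
            if p_ab > 0:                         # skip empty cells
                mi += p_ab * np.log2(p_ab / (p_a * p_b))
    return mi
```

For two balanced classes, perfect agreement yields 1 bit and statistically independent labelings yield 0 bits, so maximizing this score over candidate networks pushes the output labels toward the true classes.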
Copyright information
© 2012 Springer-Verlag Berlin Heidelberg
Cite this paper
Neme, A., Hernández, S., Nido, A., Islas, C. (2012). Multilayer Perceptrons as Classifiers Guided by Mutual Information and Trained with Genetic Algorithms. In: Yin, H., Costa, J.A.F., Barreto, G. (eds) Intelligent Data Engineering and Automated Learning - IDEAL 2012. IDEAL 2012. Lecture Notes in Computer Science, vol 7435. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-32639-4_22
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-32638-7
Online ISBN: 978-3-642-32639-4