
Multilayer Perceptrons as Classifiers Guided by Mutual Information and Trained with Genetic Algorithms

  • Conference paper
Intelligent Data Engineering and Automated Learning - IDEAL 2012

Part of the book series: Lecture Notes in Computer Science (volume 7435)

Abstract

Multilayer perceptrons can be trained with several algorithms and guided by different quantities that relate the expected output to the achieved state. The most common of these quantities is the mean squared error, but information-theoretic quantities have also been applied with great success. Another common scheme for training multilayer perceptrons is based on evolutionary computing, as an alternative to the widely used backpropagation algorithm. In this contribution we evaluate the performance of multilayer perceptrons as classifiers when trained with genetic algorithms and guided by the mutual information between the label produced by the network and the expected class. We propose a classification algorithm in which each input variable is substituted by a function of it chosen so that the mutual information between the new function and the label is maximized. These transformed variables are then fed as input to a multilayer perceptron in charge of learning the classification map, trained with genetic algorithms and guided by mutual information.
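The two-stage procedure the abstract describes can be sketched as follows. This is a minimal illustration, not the authors' implementation: the candidate transform set, the network size, the GA operators (uniform crossover, Gaussian mutation, elitist selection), and the plug-in histogram estimator of mutual information are all our assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def mutual_information(a, b):
    """Plug-in MI estimate (in bits) between two discrete label arrays."""
    a, b = np.asarray(a, int), np.asarray(b, int)
    joint = np.zeros((a.max() + 1, b.max() + 1))
    for ai, bi in zip(a, b):
        joint[ai, bi] += 1
    joint /= joint.sum()
    pa = joint.sum(axis=1, keepdims=True)
    pb = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float((joint[nz] * np.log2(joint[nz] / (pa * pb)[nz])).sum())

def discretize(x, bins=8):
    """Quantize a continuous variable into equal-width bins for the MI estimate."""
    edges = np.histogram_bin_edges(x, bins=bins)[1:-1]
    return np.digitize(x, edges)

def best_transform(x, y, bins=8):
    """Stage 1: replace an input variable by the candidate function of it
    whose output maximizes MI with the label (candidate set is our guess)."""
    candidates = [lambda v: v, np.square, np.tanh,
                  lambda v: np.log1p(np.abs(v))]
    scores = [mutual_information(discretize(f(x), bins), y) for f in candidates]
    return candidates[int(np.argmax(scores))](x)

def mlp_predict(w, X, hidden, n_classes):
    """Decode a flat weight vector into a 1-hidden-layer MLP; return hard labels."""
    d = X.shape[1]
    W1 = w[:d * hidden].reshape(d, hidden)
    W2 = w[d * hidden:].reshape(hidden, n_classes)
    return (np.tanh(X @ W1) @ W2).argmax(axis=1)

def ga_train(X, y, hidden=5, pop=40, gens=40, sigma=0.1):
    """Stage 2: evolve MLP weights with fitness = MI(predicted label, true label)."""
    n_classes = int(y.max()) + 1
    n_w = X.shape[1] * hidden + hidden * n_classes
    P = rng.normal(size=(pop, n_w))
    best, best_fit = P[0].copy(), -1.0
    for _ in range(gens):
        fit = np.array([mutual_information(mlp_predict(w, X, hidden, n_classes), y)
                        for w in P])
        i = int(fit.argmax())
        if fit[i] > best_fit:
            best, best_fit = P[i].copy(), float(fit[i])
        elite = P[np.argsort(fit)[::-1][:pop // 2]]
        mates = elite[rng.permutation(len(elite))]
        mask = rng.random(elite.shape) < 0.5          # uniform crossover
        kids = np.where(mask, elite, mates) + sigma * rng.normal(size=elite.shape)
        P = np.vstack([elite, kids])                  # elitist replacement
    return best, best_fit
```

Note that mutual information is invariant under permutation of the predicted labels, so a network with maximal fitness may classify correctly only up to a relabeling of the classes; a final label-assignment step would be needed in practice.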




Copyright information

© 2012 Springer-Verlag Berlin Heidelberg

Cite this paper

Neme, A., Hernández, S., Nido, A., Islas, C. (2012). Multilayer Perceptrons as Classifiers Guided by Mutual Information and Trained with Genetic Algorithms. In: Yin, H., Costa, J.A.F., Barreto, G. (eds) Intelligent Data Engineering and Automated Learning - IDEAL 2012. IDEAL 2012. Lecture Notes in Computer Science, vol 7435. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-32639-4_22


  • DOI: https://doi.org/10.1007/978-3-642-32639-4_22

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-32638-7

  • Online ISBN: 978-3-642-32639-4

  • eBook Packages: Computer Science (R0)
