
Diverse Evolutionary Neural Networks Based on Information Theory

  • Conference paper
Neural Information Processing (ICONIP 2007)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 4985)


Abstract

There is no consensus on how to measure the distance between two different neural network architectures. Two families of methods are used for this purpose: structural and behavioral distance measures. In this paper, we focus on the latter, which compares networks by their output responses to the same inputs. A neural network's output can be interpreted as a probability distribution given the input signals if it is normalized to sum to 1, and information-theoretic distance measures are widely used to quantify the difference between two probability distributions. We therefore adopt information-theoretic distance measures within a framework for evolving diverse neural networks in order to improve its performance. Experimental results on UCI benchmark datasets show the promise of the approach.

This research was supported by Brain Science and Engineering Research Program sponsored by Korean Ministry of Commerce, Industry and Energy.
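The behavioral distance described in the abstract compares two networks by the output distributions they produce on a shared input set. A standard information-theoretic choice for comparing such distributions is the Kullback-Leibler divergence; the sketch below uses a symmetrized KL averaged over the inputs. This is an illustrative assumption, not necessarily the exact measure or symmetrization used in the paper, and the toy softmax outputs are invented for the example.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Kullback-Leibler divergence D(p || q) between discrete distributions."""
    p = np.asarray(p, dtype=float) + eps  # eps avoids log(0) / division by zero
    q = np.asarray(q, dtype=float) + eps
    return float(np.sum(p * np.log(p / q)))

def behavioral_distance(outputs_a, outputs_b):
    """Symmetrized KL divergence averaged over a shared input set.

    outputs_a, outputs_b: arrays of shape (n_samples, n_classes),
    each row being one network's normalized output for one input.
    """
    total = 0.0
    for p, q in zip(outputs_a, outputs_b):
        # Symmetrize: plain KL is not symmetric in its arguments.
        total += 0.5 * (kl_divergence(p, q) + kl_divergence(q, p))
    return total / len(outputs_a)

# Toy example: two networks' normalized outputs on the same three inputs.
net_a = np.array([[0.7, 0.2, 0.1],
                  [0.1, 0.8, 0.1],
                  [0.3, 0.3, 0.4]])
net_b = np.array([[0.6, 0.3, 0.1],
                  [0.2, 0.7, 0.1],
                  [0.3, 0.4, 0.3]])
print(behavioral_distance(net_a, net_b))  # > 0; identical networks give 0.0
```

In an evolutionary framework, such a pairwise distance can serve as a diversity term: individuals whose output distributions are far from the rest of the population are rewarded, discouraging behaviorally redundant networks.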



Editor information

Masumi Ishikawa, Kenji Doya, Hiroyuki Miyamoto, Takeshi Yamakawa


Copyright information

© 2008 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Kim, KJ., Cho, SB. (2008). Diverse Evolutionary Neural Networks Based on Information Theory. In: Ishikawa, M., Doya, K., Miyamoto, H., Yamakawa, T. (eds) Neural Information Processing. ICONIP 2007. Lecture Notes in Computer Science, vol 4985. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-69162-4_105

  • DOI: https://doi.org/10.1007/978-3-540-69162-4_105

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-69159-4

  • Online ISBN: 978-3-540-69162-4

