Abstract
There is no consensus on how to measure the distance between two different neural network architectures. Two families of methods are used for this purpose: structural and behavioral distance measures. In this paper, we focus on the latter, which compares networks based on their output responses to the same inputs. A neural network's output can be interpreted as a probabilistic function of the input signals if it is normalized to sum to 1, and information-theoretic distance measures are widely used to quantify differences between probability distributions. Within the framework of evolving diverse neural networks, we adopt information-theoretic distance measures to improve performance. Experimental results on UCI benchmark datasets show the promise of the approach.
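In this framing, a behavioral distance between two networks can be sketched as a Kullback-Leibler divergence between their normalized output responses, averaged over a shared set of inputs. The sketch below is illustrative, not the paper's exact formulation: the softmax normalization, the symmetrization of the KL divergence, and the function names are assumptions.

```python
import numpy as np

def softmax(z):
    # Normalize raw network outputs into probability distributions
    # (one distribution per row, shifted for numerical stability).
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def kl_divergence(p, q, eps=1e-12):
    # KL(p || q) = sum_i p_i * log(p_i / q_i), computed per sample.
    # eps clipping guards against log(0) and division by zero.
    p = np.clip(p, eps, 1.0)
    q = np.clip(q, eps, 1.0)
    return np.sum(p * np.log(p / q), axis=-1)

def behavioral_distance(outputs_a, outputs_b):
    # Hypothetical behavioral distance between two networks:
    # symmetrized KL divergence of their normalized outputs,
    # averaged over the common input set. Rows of outputs_a and
    # outputs_b are the two networks' raw responses to the same inputs.
    p, q = softmax(outputs_a), softmax(outputs_b)
    return float(np.mean(kl_divergence(p, q) + kl_divergence(q, p)))
```

The symmetrized form is used here only because KL divergence itself is asymmetric; identical response patterns yield a distance of zero, and increasingly dissimilar output distributions yield strictly larger values.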
This research was supported by the Brain Science and Engineering Research Program sponsored by the Korean Ministry of Commerce, Industry and Energy.
Copyright information
© 2008 Springer-Verlag Berlin Heidelberg
Cite this paper
Kim, KJ., Cho, SB. (2008). Diverse Evolutionary Neural Networks Based on Information Theory. In: Ishikawa, M., Doya, K., Miyamoto, H., Yamakawa, T. (eds) Neural Information Processing. ICONIP 2007. Lecture Notes in Computer Science, vol 4985. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-69162-4_105
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-69159-4
Online ISBN: 978-3-540-69162-4