Abstract
Neural network design aims for high classification accuracy and low architectural complexity, and simultaneously optimizing both is known to improve generalization while avoiding overfitting. We describe a neural network training procedure that uses multi-objective optimization to evolve networks which are optimal with respect to both classification accuracy and architectural complexity. The NSGA-II algorithm is employed to evolve a population of neural networks that are minimal in both training error and a Minimum Description Length-based network complexity measure. We further propose a pruning rule based on the following heuristic: connections to or from a node may be severed if their weight values are smaller than the network’s smallest bias. Experiments on benchmark datasets show that, compared with other existing evolutionary optimization algorithms, the proposed approach yields networks with far fewer connections without seriously compromising generalization performance.
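The two ingredients the abstract describes — a bi-objective fitness (training error plus an MDL-based complexity measure) and the bias-based pruning rule — can be sketched as follows. This is an illustrative sketch, not the authors' implementation: the exact MDL complexity formula is not given here, so complexity is approximated by the number of nonzero connections, and "smaller than the network's smallest bias" is read as a comparison of absolute values, an assumption on our part.

```python
import numpy as np

def objectives(train_error, weights):
    """Bi-objective fitness vector for NSGA-II (both minimized):
    (training error, complexity). Complexity here stands in for the
    paper's MDL-based measure and simply counts nonzero connections."""
    complexity = int(np.count_nonzero(weights))
    return train_error, complexity

def prune_by_bias(weights, biases):
    """Bias-based pruning heuristic: sever any connection whose
    magnitude falls below the smallest bias magnitude in the network
    (absolute-value comparison is our assumption)."""
    threshold = min(abs(b) for b in biases)
    return np.where(np.abs(weights) < threshold, 0.0, weights)

# Example: prune a tiny 2x2 weight matrix.
w = np.array([[0.50, 0.05],
              [-0.20, 0.01]])
b = [0.1, -0.3]               # smallest |bias| = 0.1
pruned = prune_by_bias(w, b)  # connections 0.05 and 0.01 are severed
print(objectives(0.12, pruned))
```

In an NSGA-II loop, each candidate network would be evaluated with such a fitness vector and the population sorted by Pareto dominance, so no fixed trade-off weight between error and complexity needs to be chosen in advance.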
© 2007 Springer Berlin Heidelberg
Naval, P.C., Yusiong, J.P.T. (2007). An Evolutionary Multi-objective Neural Network Optimizer with Bias-Based Pruning Heuristic. In: Liu, D., Fei, S., Hou, Z., Zhang, H., Sun, C. (eds) Advances in Neural Networks – ISNN 2007. ISNN 2007. Lecture Notes in Computer Science, vol 4493. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-72395-0_23
Print ISBN: 978-3-540-72394-3
Online ISBN: 978-3-540-72395-0