Abstract
Neural network ensembles (NNEs) have been shown to outperform single neural networks (NNs) in generalization ability. The performance of an NNE depends largely on the diversity among its component NNs. Popular NNE methods, such as bagging and boosting, achieve diversity through data sampling: each NN is trained independently on a training set that is created probabilistically from the original data. Because of this independent training strategy, there is no interaction among the component NNs. To introduce interaction during training, negative correlation learning (NCL) has been proposed, in which all component NNs are trained simultaneously. NCL requires direct communication among the component NNs, which is not possible in bagging and boosting. In this study, we first modify NCL from a simultaneous to a sequential training style and then induce it into bagging and boosting to provide interaction. Empirical studies show that sequential training-time interaction increases diversity among the component NNs and outperforms the conventional methods in generalization ability.
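The sequential interaction described above can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: it substitutes simple linear readouts over a fixed random-feature expansion for full NNs, uses the standard NCL gradient (f_i − d) − λ(f_i − f̄), and combines it with bagging-style bootstrap sampling. Each member's penalty uses only the predictions of previously trained members, which is the sequential modification: members are trained one after another rather than simultaneously, yet each still interacts with the ensemble built so far.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: y = sin(x) + noise.
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + rng.normal(0.0, 0.1, size=200)

# Fixed random-feature expansion so a linear readout can stand in for an NN.
D = 10
W_in = rng.normal(size=(1, D))
b_in = rng.normal(size=D)
Phi = np.tanh(X @ W_in + b_in)  # shape (200, D)


def train_member(Phi, y, prior_preds, lam=0.5, lr=0.1, epochs=500):
    """Train one linear readout with a sequential NCL penalty.

    prior_preds holds the prediction vectors of already-trained members;
    the penalty pushes this member's output away from the running
    ensemble mean, providing the training-time interaction.
    """
    w = np.zeros(Phi.shape[1])
    for _ in range(epochs):
        f = Phi @ w
        # Running ensemble mean: previously trained members plus this one.
        # With no prior members the penalty term vanishes (f_bar == f).
        f_bar = (sum(prior_preds) + f) / (len(prior_preds) + 1)
        # Standard NCL gradient: (f - y) - lam * (f - f_bar).
        g = (f - y) - lam * (f - f_bar)
        w -= lr * Phi.T @ g / len(y)
    return w


M = 5  # ensemble size
preds = []
for i in range(M):
    # Bagging: each member is trained on its own bootstrap sample.
    idx = rng.integers(0, len(y), size=len(y))
    w = train_member(Phi[idx], y[idx], [p[idx] for p in preds])
    preds.append(Phi @ w)  # keep full-data predictions for later members

ensemble = np.mean(preds, axis=0)
mse = float(np.mean((ensemble - y) ** 2))
print(f"ensemble MSE: {mse:.4f}")
```

With λ = 0 each member reduces to plain bagging; increasing λ trades individual accuracy for diversity among members. The hyperparameter values (λ = 0.5, M = 5, learning rate, epochs) are illustrative choices, not values from the paper.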
© 2007 Springer-Verlag Berlin Heidelberg
Akhand, M.A.H., Murase, K. (2007). Neural Network Ensemble Training by Sequential Interaction. In: de Sá, J.M., Alexandre, L.A., Duch, W., Mandic, D. (eds) Artificial Neural Networks – ICANN 2007. ICANN 2007. Lecture Notes in Computer Science, vol 4668. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-74690-4_11
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-74689-8
Online ISBN: 978-3-540-74690-4