
JACIII Vol.15 No.6 pp. 652-661
doi: 10.20965/jaciii.2011.p0652
(2011)

Paper:

Hybrid Ensemble Construction with Selected Neural Networks

M. A. H. Akhand*, Pintu Chandra Shill**,
and Kazuyuki Murase**

*Dept. of Computer Science and Engineering, Khulna University of Engineering & Technology (KUET), Khulna 9203, Bangladesh

**Dept. of Human and Artificial Intelligence Systems, Graduate School of Engineering, University of Fukui

Received:
December 24, 2010
Accepted:
May 6, 2011
Published:
August 20, 2011
Keywords:
diversity, generalization, neural network ensemble, network selection
Abstract
A Neural Network Ensemble (NNE) is a convenient way to improve performance on classification tasks. Among the many methods proposed for constructing NNEs, Negative Correlation Learning (NCL), bagging, and boosting are the most popular; none of them, however, performs best on all problems. To improve performance by combining the complementary strengths of the individual methods, we propose two ways to construct hybrid ensembles that combine NCL with bagging and with boosting. Each first produces a pool of a predefined number of networks using standard NCL together with bagging (or boosting), and then uses a genetic algorithm to select an optimal subset of networks from the pool to form the NNE. Experiments on a suite of 25 benchmark problems confirmed that our proposals consistently achieve better performance with more concise ensembles than the conventional methods.
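The two-stage idea described above (build a pool of networks with a data-sampling method, then let a genetic algorithm pick the subset whose combined vote performs best) can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: it uses scikit-learn's MLPClassifier for the pool networks, so the NCL penalty term of the paper is omitted; diversity comes from bagging only; the GA is a simple bit-string search with one-point crossover and bit-flip mutation scored by majority-vote accuracy on a validation split. All pool sizes, GA settings, and dataset choices below are illustrative assumptions.

    # Sketch: pool of bagged networks + GA selection of an ensemble subset.
    # NCL joint training is omitted; parameters are illustrative assumptions.
    import numpy as np
    from sklearn.datasets import load_breast_cancer
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier

    rng = np.random.default_rng(0)
    X, y = load_breast_cancer(return_X_y=True)
    X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

    # Stage 1: build a pool of networks, each trained on a bootstrap sample (bagging).
    POOL_SIZE = 15
    pool = []
    for i in range(POOL_SIZE):
        idx = rng.integers(0, len(X_tr), len(X_tr))            # bootstrap sample
        net = MLPClassifier(hidden_layer_sizes=(10,), max_iter=300, random_state=i)
        net.fit(X_tr[idx], y_tr[idx])
        pool.append(net)

    # Cache each network's predictions on the validation split.
    val_preds = np.array([net.predict(X_val) for net in pool])  # shape (POOL_SIZE, n_val)

    def fitness(mask):
        """Validation accuracy of the majority vote over the selected networks."""
        if mask.sum() == 0:
            return 0.0
        votes = val_preds[mask.astype(bool)].mean(axis=0) >= 0.5
        return float((votes == y_val).mean())

    # Stage 2: genetic algorithm over binary masks marking which networks to keep.
    POP, GENS, P_MUT = 20, 30, 0.1
    population = rng.integers(0, 2, size=(POP, POOL_SIZE))
    for _ in range(GENS):
        scores = np.array([fitness(m) for m in population])
        parents = population[np.argsort(scores)[-POP // 2:]]     # keep the fitter half
        children = []
        while len(children) < POP:
            a, b = parents[rng.integers(0, len(parents), 2)]
            cut = rng.integers(1, POOL_SIZE)                     # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            flip = rng.random(POOL_SIZE) < P_MUT                 # bit-flip mutation
            child = np.where(flip, 1 - child, child)
            children.append(child)
        population = np.array(children)

    best = max(population, key=fitness)
    print("selected networks:", np.flatnonzero(best), "validation accuracy:", fitness(best))

In the paper's setting the same selection stage would operate on a pool produced by NCL combined with bagging or boosting; the GA's role, searching for a small subset that votes at least as well as the full pool, is unchanged.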
Cite this article as:
M. Akhand, P. Shill, and K. Murase, “Hybrid Ensemble Construction with Selected Neural Networks,” J. Adv. Comput. Intell. Intell. Inform., Vol.15 No.6, pp. 652-661, 2011.
