
An efficient ordering-based ensemble pruning algorithm via dynamic programming


Abstract

Although ordering-based pruning algorithms are relatively efficient, there remains room for further improvement. To this end, this paper applies a dynamic programming technique to the ensemble-pruning problem. We incorporate dynamic programming into the classical ordering-based ensemble-pruning algorithm with the complementariness measure (ComEP) and, with the help of two auxiliary tables, derive an efficient dynamic form, which we refer to as ComDPEP. To examine the performance of the proposed algorithm, we conduct a series of simulations on four benchmark classification datasets. The experimental results demonstrate the significantly higher efficiency of ComDPEP over the classical ComEP algorithm. The proposed ComDPEP algorithm also outperforms two other state-of-the-art ordering-based ensemble-pruning algorithms, which use uncertainty weighted accuracy and reduce-error pruning, respectively, as their measures. Notably, the effectiveness of ComDPEP is exactly the same as that of the classical ComEP algorithm: dynamic programming changes only the cost of computing the pruning order, not the order itself.
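To make the complementariness measure concrete: at each selection step it scores a candidate classifier by how often the candidate is correct on pruning-set instances that the current subensemble misclassifies. A minimal Python sketch of that score follows; the function and argument names are ours, not the authors' code.

```python
import numpy as np

def complementariness(cand_preds, ens_preds, y_true):
    """Number of pruning-set instances the current subensemble gets
    wrong but the candidate classifier gets right."""
    return int(np.sum((cand_preds == y_true) & (ens_preds != y_true)))
```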



Acknowledgments

This work was supported by the National Natural Science Foundation of China under Grant No. 61473150.

Author information


Corresponding author

Correspondence to Qun Dai.

Ethics declarations

Conflict of Interest

The authors declare that they have no conflict of interest.

Informed consent

Informed consent was obtained from all individual participants included in the study.

Additional informed consent was obtained from all individual participants for whom identifying information is included in this article.

Additional information

Research involving Human Participants and/or Animals

All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards.

For this type of study, formal consent is not required.

All applicable international, national, and/or institutional guidelines for the care and use of animals were followed.

Appendices

Appendix A: Formal procedure of the ComEP algorithm

The formal procedure of the ComEP algorithm is as follows.

[Figure c: formal procedure of the ComEP algorithm]
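The original listing is rendered as an image and is not reproduced here. As a substitute, the sketch below gives one plausible reading of an ordering-based greedy loop driven by the complementariness measure; it is our reconstruction under stated assumptions (integer class labels 0..C-1, majority voting, a labelled pruning set), not the authors' exact pseudocode.

```python
import numpy as np

def com_ep(preds, y_true, p):
    """Greedy ComEP-style ordering (illustrative reconstruction only).

    preds  : (T, N) int array; preds[t, i] is classifier t's label
             (assumed in 0..C-1) for pruning-set instance i
    y_true : (N,) int array of true labels
    p      : size of the pruned subensemble
    """
    T, N = preds.shape
    n_classes = int(preds.max()) + 1
    # Seed with the individually most accurate classifier.
    first = int(np.argmax((preds == y_true).sum(axis=1)))
    selected, remaining = [first], set(range(T)) - {first}

    while len(selected) < p:
        # Recompute the subensemble's majority vote from scratch each
        # round -- the repeated O(|selected| * N) work that the dynamic
        # programming version (Appendix C) avoids.
        votes = np.zeros((N, n_classes), dtype=int)
        for t in selected:
            votes[np.arange(N), preds[t]] += 1
        ens = votes.argmax(axis=1)
        wrong = ens != y_true
        # Pick the candidate most complementary to the current subensemble.
        best = max(remaining,
                   key=lambda t: int(np.sum((preds[t] == y_true) & wrong)))
        selected.append(best)
        remaining.remove(best)
    return selected
```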

Appendix B: Detailed computational rules for Table 2

The detailed computational rules for Table 2, viz. ClassVoteCounts, are as follows.

[Figure d: computational rules for the ClassVoteCounts table]
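The original rules are likewise shown as an image. The ClassVoteCounts table tallies, for every pruning-set instance, how many already-selected classifiers vote for each class, so appending one classifier needs only a single increment per instance. A hedged sketch of that update (array shapes and names are our assumptions):

```python
import numpy as np

def update_class_vote_counts(class_vote_counts, new_preds):
    """Fold one newly selected classifier's votes into the table.

    class_vote_counts : (N, C) int array; entry [i, c] counts how many
                        selected classifiers vote class c on instance i
    new_preds         : (N,) labels predicted by the classifier just added
    """
    n = class_vote_counts.shape[0]
    # One increment per instance: O(N) per selection step, independent
    # of how many classifiers have already been selected.
    class_vote_counts[np.arange(n), new_preds] += 1
    return class_vote_counts
```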

Appendix C: Formal procedure of the ComDPEP algorithm

The formal procedure of the ComDPEP algorithm is as follows.

[Figure e: formal procedure of the ComDPEP algorithm]
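Again as a stand-in for the image, the following sketch shows what a dynamic-programming rewrite might look like under the same assumptions as the Appendix A sketch: the same greedy ordering as ComEP, but with the subensemble's votes cached in two auxiliary tables and updated incrementally. Variable names and details are our reconstruction, not the authors' code.

```python
import numpy as np

def com_dp_ep(preds, y_true, p):
    """ComDPEP-style pruning sketch: the greedy ComEP ordering with
    cached, incrementally updated subensemble votes (our reconstruction)."""
    T, N = preds.shape
    n_classes = int(preds.max()) + 1
    first = int(np.argmax((preds == y_true).sum(axis=1)))
    selected, remaining = [first], set(range(T)) - {first}

    # Auxiliary table 1 (cf. ClassVoteCounts): running per-class vote counts.
    votes = np.zeros((N, n_classes), dtype=int)
    votes[np.arange(N), preds[first]] += 1
    # Auxiliary table 2: the cached subensemble prediction per instance.
    ens = votes.argmax(axis=1)

    while len(selected) < p:
        wrong = ens != y_true
        best = max(remaining,
                   key=lambda t: int(np.sum((preds[t] == y_true) & wrong)))
        selected.append(best)
        remaining.remove(best)
        # Dynamic-programming step: reuse the stored counts instead of
        # re-tallying every selected classifier, so each round costs
        # O(N * C) rather than O(|selected| * N).
        votes[np.arange(N), preds[best]] += 1
        ens = votes.argmax(axis=1)
    return selected
```

Because both variants maximize the same complementariness score at every step, they select identical subensembles; the tables only remove redundant recomputation, consistent with the abstract's remark that ComDPEP matches ComEP's effectiveness.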


About this article


Cite this article

Dai, Q., Han, X. An efficient ordering-based ensemble pruning algorithm via dynamic programming. Appl Intell 44, 816–830 (2016). https://doi.org/10.1007/s10489-015-0729-z
