
Maximum relevancy maximum complementary based ordered aggregation for ensemble pruning


Abstract

Ensemble methods have delivered exceptional performance in a wide range of applications, but this performance comes at the expense of heavy storage requirements and slower predictions. Ensemble pruning aims to reduce the complexity of this popular learning paradigm without worsening its performance. This paper presents an efficient and effective ordering-based ensemble pruning method that ranks all the base classifiers with respect to a maximum relevancy maximum complementary (MRMC) measure. The MRMC measure evaluates each base classifier's classification ability as well as its complementarity to the ensemble, so that a set of accurate and complementary base classifiers can be selected. Moreover, an evaluation function that deliberately favors candidate sub-ensembles that perform better on low-margin instances is also proposed. Experiments on 25 benchmark datasets demonstrate the effectiveness of the proposed method.
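To make the ordering strategy described in the abstract concrete, the sketch below shows one way an MRMC-style ordered aggregation could be implemented. It is a minimal illustration, not the authors' algorithm: the specific choices of mutual information I(h_i; y) for relevance, conditional mutual information I(h_i; y | ensemble vote) for complementarity, the trade-off weight alpha, the majority-vote combiner, and the inverse-margin instance weights in the evaluation function are all assumptions made here for illustration, as are the function names (mrmc_order, margin_weighted_accuracy).

```python
# Illustrative sketch of MRMC-style ordered aggregation (NOT the authors' code).
# Assumptions: relevance = I(h_i; y) and complementarity = I(h_i; y | ensemble vote),
# both estimated on a held-out pruning set; classifiers are combined by majority
# vote; `alpha` trades off the two terms. Class labels are integers 0..K-1.
import numpy as np
from sklearn.metrics import mutual_info_score

def conditional_mi(x, y, z):
    """Plug-in estimate of I(X; Y | Z): MI within each stratum of Z, weighted by P(Z)."""
    return sum((z == v).mean() * mutual_info_score(x[z == v], y[z == v])
               for v in np.unique(z))

def majority_vote(preds):
    """Majority vote over a (n_classifiers, n_samples) matrix of integer predictions."""
    return np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, preds)

def mrmc_order(preds, y, alpha=0.5):
    """Greedily rank classifiers by alpha*relevance + (1-alpha)*complementarity."""
    n = preds.shape[0]
    relevance = np.array([mutual_info_score(preds[i], y) for i in range(n)])
    order = [int(relevance.argmax())]        # seed with the most relevant classifier
    remaining = set(range(n)) - set(order)
    while remaining:
        ens = majority_vote(preds[order])    # current sub-ensemble's decision
        best = max(remaining,
                   key=lambda i: alpha * relevance[i]
                                 + (1 - alpha) * conditional_mi(preds[i], y, ens))
        order.append(best)
        remaining.discard(best)
    return order

def margin_weighted_accuracy(preds, y, n_classes):
    """Evaluate a sub-ensemble, up-weighting low-margin instances (an illustrative
    stand-in for the paper's evaluation function). Margin = vote share of the
    true class minus the largest rival vote share."""
    votes = np.stack([np.bincount(preds[:, j], minlength=n_classes)
                      for j in range(preds.shape[1])])   # (n_samples, n_classes)
    shares = votes / preds.shape[0]
    true_share = shares[np.arange(len(y)), y]
    rival = shares.copy()
    rival[np.arange(len(y)), y] = -np.inf
    margin = true_share - rival.max(axis=1)
    weights = 1.0 / (1.0 + np.maximum(margin, 0.0))      # low margin -> weight near 1
    correct = (majority_vote(preds) == y).astype(float)
    return float((weights * correct).sum() / weights.sum())
```

In use, one would train an ensemble (e.g., by bagging), collect each base classifier's predictions on a held-out pruning set to form preds, rank the classifiers with mrmc_order, and then sweep over prefix lengths of the returned order, keeping the prefix that maximizes margin_weighted_accuracy.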





Author information


Corresponding author

Correspondence to Tao Lin.


About this article


Cite this article

Xia, X., Lin, T. & Chen, Z. Maximum relevancy maximum complementary based ordered aggregation for ensemble pruning. Appl Intell 48, 2568–2579 (2018). https://doi.org/10.1007/s10489-017-1106-x

