
Building Weighted Classifier Ensembles Through Classifiers Pruning

  • Conference paper
  • Published in: Internet Multimedia Computing and Service (ICIMCS 2017)
  • Part of the book series: Communications in Computer and Information Science (CCIS, volume 819)


Abstract

Many theoretical and experimental studies have shown that ensemble learning is an effective technique for achieving better classification accuracy and stability than individual classifiers. In this paper, we propose a novel weighted classifier ensemble method based on two-stage classifier pruning. In the first stage, we use canonical correlation analysis (CCA) to model the maximum-correlation relationships between training data points and base classifiers. Based on these global multi-linear projections, a sparse regression method is proposed to prune the base classifiers, so that each test data point dynamically selects a subset of classifiers to form its own ensemble; this reduces the influence of noisy input data and inaccurate classifiers from a global perspective. In the second stage, the pruned classifiers are weighted locally by a fusion method that exploits their generalization ability on the nearest neighbors of each test data point. In this way, every test data point builds a unique locally weighted classifier ensemble. Experimental results on several UCI data sets show that our method achieves better classification performance than other ensemble methods such as Random Forests, Majority Voting, AdaBoost, and DREP.
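
The two-stage pipeline described above can be illustrated with a short, self-contained sketch. The concrete choices below are assumptions made only for illustration (bootstrapped decision trees as the base pool, scikit-learn's CCA, Lasso as a stand-in for the paper's sparse regression, and accuracy on the k nearest training neighbors as the local weights); they are not the authors' exact formulation.

    # A minimal sketch of the two-stage weighted ensemble described above.
    # Assumptions (not from the paper): bootstrapped decision trees as the base
    # pool, scikit-learn's CCA, Lasso as the sparse regression used for pruning,
    # and neighbourhood accuracy as the local classifier weights.
    import numpy as np
    from sklearn.cross_decomposition import CCA
    from sklearn.datasets import load_breast_cancer
    from sklearn.linear_model import Lasso
    from sklearn.model_selection import train_test_split
    from sklearn.neighbors import NearestNeighbors
    from sklearn.tree import DecisionTreeClassifier

    X, y = load_breast_cancer(return_X_y=True)            # stand-in for a UCI data set
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    # Train a pool of base classifiers on bootstrap samples (bagging-style).
    rng = np.random.RandomState(0)
    pool = []
    for i in range(20):
        idx = rng.randint(0, len(X_tr), len(X_tr))
        pool.append(DecisionTreeClassifier(max_depth=3, random_state=i).fit(X_tr[idx], y_tr[idx]))

    # Classifier-output matrix on the training set: one column per base classifier.
    H_tr = np.column_stack([clf.predict(X_tr) for clf in pool]).astype(float)

    # Stage 1 (global): CCA relates training points to base-classifier outputs.
    cca = CCA(n_components=5).fit(X_tr, H_tr)
    nn = NearestNeighbors(n_neighbors=15).fit(X_tr)

    def predict_one(x):
        # Sparse regression in the canonical space: approximate the test point's
        # CCA scores by a sparse combination of the classifiers' canonical
        # loadings; classifiers with non-zero coefficients are kept (pruning).
        z = cca.transform(x.reshape(1, -1)).ravel()        # (n_components,)
        A = cca.y_loadings_.T                              # (n_components, n_classifiers)
        beta = Lasso(alpha=0.01, max_iter=10000).fit(A, z).coef_
        kept = np.flatnonzero(beta)
        if kept.size == 0:                                 # degenerate case: keep the full pool
            kept = np.arange(len(pool))

        # Stage 2 (local): weight each kept classifier by its accuracy on the
        # test point's nearest training neighbours, then take a weighted vote.
        nbr = nn.kneighbors(x.reshape(1, -1), return_distance=False).ravel()
        votes = {}
        for j in kept:
            w = np.mean(pool[j].predict(X_tr[nbr]) == y_tr[nbr])
            label = pool[j].predict(x.reshape(1, -1))[0]
            votes[label] = votes.get(label, 0.0) + w
        return max(votes, key=votes.get)

    preds = np.array([predict_one(x) for x in X_te])
    print("test accuracy:", np.mean(preds == y_te))

The per-point Lasso step is what makes the ensemble unique to each test point; the neighborhood-accuracy weights then play the role of the locally weighted fusion of the second stage.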


References

  1. Zhou, Z.H.: Ensemble Methods: Foundations and Algorithms. Taylor & Francis, Abingdon (2012)

  2. Liu, J., Shang, S., Zheng, K., Wen, J.R.: Multi-view ensemble learning for dementia diagnosis from neuroimaging: an artificial neural network approach. Neurocomputing 195, 112–116 (2016)

  3. Ghorai, S., Mukherjee, A., Sengupta, S., Dutta, P.K.: Cancer classification from gene expression data by NPPC ensemble. IEEE/ACM Trans. Comput. Biol. Bioinform. 8(3), 659–671 (2010)

  4. Liu, L., Shao, L., Rockett, P.: Boosted key-frame selection and correlated pyramidal motion-feature representation for human action recognition. Pattern Recogn. 46(7), 1810–1818 (2013)

  5. Breiman, L.: Bagging predictors. Mach. Learn. 24(2), 123–140 (1996)

  6. Schapire, R.E.: The Boosting Approach to Machine Learning: An Overview. Springer, New York (2002). https://doi.org/10.1007/978-0-387-21579-2_9

  7. Liaw, A., Wiener, M.: Classification and regression by randomForest. R News 2(3), 18–22 (2002)

  8. Tsoumakas, G., Partalas, I., Vlahavas, I.: An ensemble pruning primer. Stud. Comput. Intell. 245, 1–13 (2009)

  9. Martínez-Muñoz, G., Hernández-Lobato, D., Suárez, A.: An analysis of ensemble pruning techniques based on ordered aggregation. IEEE Trans. Pattern Anal. Mach. Intell. 31(2), 245–259 (2009)

  10. Li, N., Yu, Y., Zhou, Z.-H.: Diversity regularized ensemble pruning. In: Flach, P.A., De Bie, T., Cristianini, N. (eds.) ECML PKDD 2012. LNCS (LNAI), vol. 7523, pp. 330–345. Springer, Heidelberg (2012). https://doi.org/10.1007/978-3-642-33460-3_27

  11. Zhang, L., Zhou, W.D.: Sparse ensembles using weighted combination methods based on linear programming. Pattern Recogn. 44(1), 97–106 (2011)

  12. Liu, T., Tao, D.: Classification with noisy labels by importance reweighting. IEEE Trans. Pattern Anal. Mach. Intell. 38(3), 447–461 (2016)

  13. Mao, S., Jiao, L., Xiong, L., Gou, S., Chen, B., Yeung, S.K.: Weighted classifier ensemble based on quadratic form. Pattern Recogn. 48(5), 1688–1706 (2014)

  14. Hardoon, D.R., Szedmak, S., Shawe-Taylor, J.: Canonical correlation analysis: an overview with application to learning methods. Neural Comput. 16(12), 2639–2664 (2004)

  15. Blake, C.: UCI repository of machine learning databases (1998). http://www.ics.uci.edu/~mlearn/MLRepository.html

  16. Quinlan, J.R.: C4.5: Programs for Machine Learning, vol. 1. Morgan Kaufmann Publishers, Burlington (1992)


Acknowledgments

This work was funded in part by the National Natural Science Foundation of China (No. 61572240, 61601202), Natural Science Foundation of Jiangsu Province (Grant No. BK20140571), the Open Project Program of the National Laboratory of Pattern Recognition (NLPR) (No. 201600005), and the Research Innovation Program for College Graduates of Jiangsu Province (Grant No. SJLX16_0440).

Author information

Corresponding author

Correspondence to XiangJun Shen.


Copyright information

© 2018 Springer Nature Singapore Pte Ltd.

About this paper

Cite this paper

Cai, C., Wornyo, D.K., Wang, L., Shen, X. (2018). Building Weighted Classifier Ensembles Through Classifiers Pruning. In: Huet, B., Nie, L., Hong, R. (eds) Internet Multimedia Computing and Service. ICIMCS 2017. Communications in Computer and Information Science, vol 819. Springer, Singapore. https://doi.org/10.1007/978-981-10-8530-7_13

Download citation

  • DOI: https://doi.org/10.1007/978-981-10-8530-7_13

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-10-8529-1

  • Online ISBN: 978-981-10-8530-7

  • eBook Packages: Computer Science, Computer Science (R0)
