
Ensemble Pruning via Base-Classifier Replacement

  • Conference paper
Web-Age Information Management (WAIM 2011)

Part of the book series: Lecture Notes in Computer Science (volume 6897)


Abstract

Ensemble pruning is a technique that increases ensemble accuracy and reduces ensemble size by selecting an optimal or near-optimal subset of the ensemble members to form a subensemble for prediction. A number of ensemble pruning methods based on a greedy search policy have recently been proposed. In this paper, we contribute a new greedy ensemble pruning method, called EPR, that is based on a replacement policy. Unlike traditional pruning methods, EPR searches for an optimal or near-optimal subensemble of a predefined size by iteratively replacing the least important classifier in the subensemble with the current candidate classifier; in particular, no replacement occurs when the current candidate is itself the least important. We also adopt a diversity measure [1] to analyze the properties of EPR theoretically, and on that basis propose a new metric to guide EPR’s search process. We evaluate EPR by comparing it with other state-of-the-art greedy ensemble pruning methods and obtain very promising results.

Supported by the National Science Foundation of China (No. 60901078).
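The replacement loop described in the abstract can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors’ implementation: it uses plain validation accuracy of the voted subensemble as the selection criterion, whereas EPR employs a diversity-derived metric (cf. [1] and the metric proposed in the paper). The names `epr_prune`, `majority_vote`, and `subensemble_accuracy` are hypothetical.

```python
from collections import Counter

def majority_vote(subensemble, x):
    """Combine member predictions on one instance by plurality vote."""
    votes = Counter(clf(x) for clf in subensemble)
    return votes.most_common(1)[0][0]

def subensemble_accuracy(subensemble, data):
    """Fraction of validation instances the voted subensemble classifies correctly."""
    correct = sum(majority_vote(subensemble, x) == y for x, y in data)
    return correct / len(data)

def epr_prune(ensemble, data, p):
    """Replacement-style pruning sketch: maintain a subensemble of fixed
    size p; for each remaining classifier, try swapping it in for every
    current member, and keep the best swap only if it improves the
    validation score. If the candidate never helps, no replacement occurs."""
    sub = list(ensemble[:p])              # initialise with the first p members
    for clf in ensemble[p:]:
        best_acc = subensemble_accuracy(sub, data)
        best_sub = sub
        for i in range(p):                # candidate replaces member i
            cand = sub[:i] + [clf] + sub[i + 1:]
            acc = subensemble_accuracy(cand, data)
            if acc > best_acc:
                best_acc, best_sub = acc, cand
        sub = best_sub                    # unchanged if clf was least useful
    return sub
```

With classifiers represented as plain callables, `epr_prune(ensemble, validation_set, p)` returns a subensemble of exactly `p` members whose voted validation score is monotonically non-decreasing over the scan, mirroring the "replace only if it helps" rule.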


References

  1. Ho, T.K.: The random subspace method for constructing decision forests. IEEE Transactions on Pattern Analysis and Machine Intelligence 20(8), 832–844 (1998)

  2. Kuncheva, L.I.: Combining Pattern Classifiers: Methods and Algorithms. John Wiley and Sons, Chichester (2004)

  3. Breiman, L.: Bagging predictors. Machine Learning 24(2), 123–140 (1996)

  4. Freund, Y., Schapire, R.E.: A decision-theoretic generalization of on-line learning and an application to boosting. Journal of Computer and System Sciences 55(1), 119–139 (1997)

  5. Breiman, L.: Random forests. Machine Learning 45(1), 5–32 (2001)

  6. Rodríguez, J.J., Kuncheva, L.I., Alonso, C.J.: Rotation forest: A new classifier ensemble method. IEEE Transactions on Pattern Analysis and Machine Intelligence 28(10), 1619–1630 (2006)

  7. Zhang, D., Chen, S., Zhou, Z., Yang, Q.: Constraint projections for ensemble learning. In: Proceedings of the Twenty-Third AAAI Conference on Artificial Intelligence (AAAI 2008), pp. 758–763 (2008)

  8. Zhou, Z.H., Wu, J., Tang, W.: Ensembling neural networks: Many could be better than all. Artificial Intelligence 137(1-2), 239–263 (2002)

  9. Zhang, Y., Burer, S., Street, W.N.: Ensemble pruning via semi-definite programming. Journal of Machine Learning Research 7, 1315–1338 (2006)

  10. Margineantu, D.D., Dietterich, T.G.: Pruning adaptive boosting. In: Proceedings of the 14th International Conference on Machine Learning, pp. 211–218 (1997)

  11. Tamon, C., Xiang, J.: On the boosting pruning problem. In: López de Mántaras, R., Plaza, E. (eds.) ECML 2000. LNCS (LNAI), vol. 1810, pp. 404–412. Springer, Heidelberg (2000)

  12. Fan, W., Chu, F., Wang, H.X., Yu, P.S.: Pruning and dynamic scheduling of cost-sensitive ensembles. In: Proceedings of the Eighteenth National Conference on Artificial Intelligence (AAAI), pp. 145–151 (2002)

  13. Caruana, R., Niculescu-Mizil, A., Crew, G., Ksikes, A.: Ensemble selection from libraries of models. In: Proceedings of the Twenty-First International Conference on Machine Learning (2004)

  14. Martínez-Muñoz, G., Suárez, A.: Aggregation ordering in bagging. In: Proceedings of the International Conference on Artificial Intelligence and Applications (IASTED), pp. 258–263. Acta Press, Calgary (2004)

  15. Martínez-Muñoz, G., Suárez, A.: Pruning in ordered bagging ensembles. In: Proceedings of the 23rd International Conference on Machine Learning, pp. 609–616 (2006)

  16. Lu, Z.Y., Wu, X.D., Zhu, X.Q., Bongard, J.: Ensemble pruning via individual contribution ordering. In: Proceedings of the 16th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 871–880 (2010)

  17. Banfield, R.E., Hall, L.O., Bowyer, K.W., Kegelmeyer, W.P.: Ensemble diversity measures and their application to thinning. Information Fusion 6(1), 49–62 (2005)

  18. Partalas, I., Tsoumakas, G., Vlahavas, I.P.: Focused ensemble selection: A diversity-based method for greedy ensemble selection. In: 18th European Conference on Artificial Intelligence, pp. 117–121 (2008)

  19. Partalas, I., Tsoumakas, G., Vlahavas, I.P.: An ensemble uncertainty aware measure for directed hill climbing ensemble pruning. Machine Learning 81(3), 257–282 (2010)

  20. Kuncheva, L.I., Whitaker, C.J.: Measures of diversity in classifier ensembles and their relationship with the ensemble accuracy. Machine Learning 51(2), 181–207 (2003)

  21. Asuncion, A., Newman, D.J.: UCI Machine Learning Repository (2007)

  22. Quinlan, J.R.: C4.5: Programs for Machine Learning. Morgan Kaufmann, San Francisco (1993)

  23. Witten, I.H., Frank, E.: Data Mining: Practical Machine Learning Tools and Techniques, 2nd edn. Morgan Kaufmann, San Francisco (2005)

  24. Demšar, J.: Statistical comparisons of classifiers over multiple data sets. Journal of Machine Learning Research 7, 1–30 (2006)


Copyright information

© 2011 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Guo, H., Fan, M. (2011). Ensemble Pruning via Base-Classifier Replacement. In: Wang, H., Li, S., Oyama, S., Hu, X., Qian, T. (eds) Web-Age Information Management. WAIM 2011. Lecture Notes in Computer Science, vol 6897. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-23535-1_43


  • DOI: https://doi.org/10.1007/978-3-642-23535-1_43

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-23534-4

  • Online ISBN: 978-3-642-23535-1

  • eBook Packages: Computer Science (R0)
