
Observations on Boosting Feature Selection

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 3541)

Abstract

This paper presents a study of the Boosting Feature Selection (BFS) algorithm [1], a method that incorporates feature selection into Adaboost. Such an algorithm is interesting because it combines the methods studied by Boosting researchers with those studied by ensemble feature selection researchers.

Observations are made on generalisation, weighted error and error diversity to compare the algorithm's performance to that of Adaboost when using a nearest mean base learner. Ensemble feature prominence is proposed as a stopping criterion for ensemble construction, and its quality is assessed using the same performance measures. BFS is found to compete with Adaboost in terms of performance, despite the reduced feature description available to each base classifier; this is explained using weighted error and error diversity. Results show the proposed stopping criterion to be useful for trading off ensemble performance against complexity.
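For concreteness, the sketch below illustrates the general scheme the abstract describes: standard Adaboost in which each boosting round additionally selects the single feature whose weighted nearest mean stump achieves the lowest weighted error, so that every base classifier works on a reduced feature description. This is a minimal illustration assuming two-class labels in {-1, +1}; the function names (`bfs_adaboost`, `nearest_mean_stump`) and implementation details are illustrative and not taken from the paper, and the proposed feature prominence stopping criterion is not implemented here.

```python
# Minimal sketch of boosting with per-round feature selection (BFS-style),
# assuming binary labels in {-1, +1}. Names and details are illustrative.
import numpy as np

def nearest_mean_stump(x, y, w):
    """Weighted nearest-mean classifier on a single feature x.
    y holds labels in {-1, +1}; w are the boosting example weights."""
    mu_pos = np.average(x[y == 1], weights=w[y == 1])
    mu_neg = np.average(x[y == -1], weights=w[y == -1])
    thresh = 0.5 * (mu_pos + mu_neg)          # boundary between the class means
    sign = 1.0 if mu_pos >= mu_neg else -1.0  # orientation of the positive side
    pred = np.where(sign * (x - thresh) >= 0, 1, -1)
    err = float(np.sum(w[pred != y]))         # weighted training error
    return (thresh, sign), pred, err

def bfs_adaboost(X, y, n_rounds=50):
    """Adaboost in which every round selects one feature for its base learner."""
    n, d = X.shape
    w = np.full(n, 1.0 / n)                   # uniform example weights
    ensemble = []                             # list of (feature index, stump, alpha)
    for _ in range(n_rounds):
        # Feature selection step: keep the feature with the lowest weighted error.
        candidates = [nearest_mean_stump(X[:, j], y, w) + (j,) for j in range(d)]
        stump, pred, err, j = min(candidates, key=lambda c: c[2])
        err = np.clip(err, 1e-10, 1.0 - 1e-10)
        alpha = 0.5 * np.log((1.0 - err) / err)   # usual Adaboost coefficient
        ensemble.append((j, stump, alpha))
        # Re-weight: misclassified examples gain weight, correct ones lose it.
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
    return ensemble

def bfs_predict(ensemble, X):
    """Weighted vote of the per-feature base classifiers."""
    score = np.zeros(X.shape[0])
    for j, (thresh, sign), alpha in ensemble:
        score += alpha * np.where(sign * (X[:, j] - thresh) >= 0, 1, -1)
    return np.sign(score)
```

In this reading, each element of the returned ensemble sees only one feature, which is the reduced feature description per base classifier that the abstract refers to; whether such an ensemble matches full-feature Adaboost is exactly the question the paper examines empirically.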


References

1. Tieu, K., Viola, P.: Boosting image retrieval. In: IEEE Conf. on Computer Vision and Pattern Recognition, pp. 228–235 (2000)

2. Freund, Y., Schapire, R.E.: Experiments with a new boosting algorithm. In: Proc. 13th International Conference on Machine Learning, pp. 148–156 (1996)

3. Rätsch, G., Warmuth, M.: Maximizing the margin with boosting. In: Proceedings of the 15th Annual Conference on Computational Learning Theory, pp. 334–350 (2002)

4. Ho, T.: The random subspace method for constructing decision forests. IEEE Transactions on Pattern Analysis and Machine Intelligence 20, 832–844 (1998)

5. Cunningham, P., Carney, J.: Diversity versus quality in classification ensembles based on feature selection. In: Lopez de Mantaras, R., Plaza, E. (eds.) ECML 2000. LNCS (LNAI), vol. 1810, pp. 109–116. Springer, Heidelberg (2000)

6. Bryll, R., Gutierrez-Osuna, R., Quek, F.: Attribute bagging: improving accuracy of classifier ensembles by using random feature subsets. Pattern Recognition 36, 1291–1302 (2003)

7. Guerra-Salcedo, C., Whitley, D.: Feature selection mechanisms for ensemble creation: a genetic search perspective. In: AAAI 1999 (1999)

8. Tsymbal, A., Pechenizkiy, M., Cunningham, P.: Diversity in search strategies for ensemble feature selection. Information Fusion 6, 83–98 (2005)

9. Günter, S., Bunke, H.: Feature selection algorithms for the generation of multiple classifier systems and their application to handwritten word recognition. Pattern Recognition Letters 25, 1323–1336 (2004)

10. Oza, N., Tumer, K.: Input decimation ensembles: decorrelation through dimensionality reduction. In: Proc. 2nd International Workshop on Multiple Classifier Systems, Cambridge, UK, pp. 238–247 (2001)

11. Brown, G., Wyatt, J., Harris, R., Yao, X.: Diversity creation methods: a survey and categorisation. Information Fusion 6, 5–20 (2005)

12. Quinlan, J.R.: Bagging, boosting, and C4.5. In: Proceedings of the Thirteenth National Conference on Artificial Intelligence, pp. 725–730 (1996)

13. Schapire, R., Singer, Y.: Improved boosting algorithms using confidence-rated predictions. Machine Learning 37, 297–336 (1999)

14. Kuncheva, L., Whitaker, C.: Measures of diversity in classifier ensembles. Machine Learning 51, 181–207 (2003)

15. Fleiss, J.: Statistical methods for rates and proportions (1981)

16. Blake, C., Merz, C.: UCI repository of machine learning databases (1998)

17. Skurichina, M., Kuncheva, L.I., Duin, R.P.W.: Bagging and boosting for the nearest mean classifier: effects of sample size on diversity and accuracy. In: Roli, F., Kittler, J. (eds.) MCS 2002. LNCS, vol. 2364, pp. 62–71. Springer, Heidelberg (2002)

18. Freund, Y., Schapire, R.: A decision-theoretic generalization of on-line learning and an application to boosting. Journal of Computer and System Sciences, 119–139 (1997)

19. Schapire, R., Freund, Y., Bartlett, P., Lee, W.: Boosting the margin: a new explanation for the effectiveness of voting methods. The Annals of Statistics, 1651–1686 (1998)

Copyright information

© 2005 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Redpath, D.B., Lebart, K. (2005). Observations on Boosting Feature Selection. In: Oza, N.C., Polikar, R., Kittler, J., Roli, F. (eds) Multiple Classifier Systems. MCS 2005. Lecture Notes in Computer Science, vol 3541. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11494683_4

  • DOI: https://doi.org/10.1007/11494683_4

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-26306-7

  • Online ISBN: 978-3-540-31578-0

  • eBook Packages: Computer Science, Computer Science (R0)
