Exact Bagging with k-Nearest Neighbour Classifiers

  • Conference paper
Multiple Classifier Systems (MCS 2004)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 3077)

Abstract

A formula is derived for the exact computation of Bagging classifiers when the base model adopted is k-Nearest Neighbour (k-NN). The formula, which holds in any dimension and does not require the extraction of bootstrap replicates, proves that Bagging cannot improve 1-Nearest Neighbour. It also proves that, for k > 1, Bagging has a smoothing effect on k-NN. Convergence of empirically bagged k-NN predictors to the exact formula is also considered. Efficient approximations to the exact formula are derived, and their applicability to practical cases is illustrated.
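The abstract's central idea can be illustrated in the simplest case, k = 1. In a bootstrap replicate of size n, the prediction is made by the nearest training point that happens to be present in the replicate, so bagged 1-NN amounts to a weighted vote over neighbours ordered by distance, with weights given by elementary inclusion-order combinatorics. The sketch below is ours, not the paper's general k-NN formula (which the paper derives for arbitrary k); function names are hypothetical:

```python
import random

def exact_1nn_weights(n):
    """Exact probability that the j-th nearest neighbour (j = 1..n) of a
    query point is the nearest one present in a size-n bootstrap replicate:
    w_j = ((n-j+1)/n)**n - ((n-j)/n)**n."""
    return [((n - j + 1) / n) ** n - ((n - j) / n) ** n for j in range(1, n + 1)]

def bootstrap_1nn_weights(n, replicates=20000, seed=0):
    """Monte Carlo estimate of the same weights from explicit replicates."""
    rng = random.Random(seed)
    counts = [0] * n
    for _ in range(replicates):
        # Draw n training indices with replacement; index 0 stands for the
        # query's nearest neighbour, index 1 for the second nearest, etc.
        present = {rng.randrange(n) for _ in range(n)}
        counts[min(present)] += 1
    return [c / replicates for c in counts]

n = 10
exact = exact_1nn_weights(n)
approx = bootstrap_1nn_weights(n)
print("sum of exact weights:", sum(exact))  # mathematically telescopes to 1
print("max |exact - bootstrap|:", max(abs(e - a) for e, a in zip(exact, approx)))
```

The weights sum to 1 and decrease strictly with neighbour rank, which illustrates two points from the abstract: a closed form makes drawing bootstrap replicates unnecessary, and averaging over replicates spreads the vote smoothly across neighbours rather than concentrating it on the nearest one.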




Copyright information

© 2004 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Caprile, B., Merler, S., Furlanello, C., Jurman, G. (2004). Exact Bagging with k-Nearest Neighbour Classifiers. In: Roli, F., Kittler, J., Windeatt, T. (eds) Multiple Classifier Systems. MCS 2004. Lecture Notes in Computer Science, vol 3077. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-25966-4_7

  • DOI: https://doi.org/10.1007/978-3-540-25966-4_7

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-22144-9

  • Online ISBN: 978-3-540-25966-4

  • eBook Packages: Springer Book Archive