Disturbing Neighbors Ensembles for Linear SVM

  • Conference paper
In: Multiple Classifier Systems (MCS 2009)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 5519)

Abstract

Ensembles need base classifiers that do not always agree in their predictions, that is, diverse base classifiers. Disturbing Neighbors (\(\mathcal{DN}\)) is a method for improving the diversity of the base classifiers of any ensemble algorithm. For each base classifier, \(\mathcal{DN}\) builds a set of extra features from the output of a 1-Nearest Neighbor (1-NN) classifier, which is itself built from a small, randomly selected subset of the training instances. \(\mathcal{DN}\) has already been applied successfully to unstable base classifiers (e.g., decision trees). This paper presents an experimental validation on 62 UCI datasets of standard ensemble methods that use Support Vector Machines (SVMs) with a linear kernel as base classifiers. SVMs are very stable, so it is hard to increase their diversity within an ensemble. Nevertheless, the experiments show that \(\mathcal{DN}\) usually improves both ensemble accuracy and base classifier diversity.
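
The \(\mathcal{DN}\) construction is compact enough to sketch in code. The following is a minimal, illustrative Python sketch using scikit-learn, not the authors' implementation; the function name make_dn_transform, the choice of m = 10 anchor instances, the ten-member majority-vote ensemble, and the iris data are all assumptions made here for illustration. The original formulation also restricts the 1-NN distance to a random subset of the input dimensions, which this sketch omits for brevity.

    # Hypothetical sketch of Disturbing Neighbors (DN) feature construction.
    import numpy as np
    from sklearn.datasets import load_iris
    from sklearn.neighbors import NearestNeighbors
    from sklearn.svm import LinearSVC

    def make_dn_transform(X, y, m=10, seed=None):
        """Build a DN feature map: m one-hot columns marking which of m
        randomly chosen training instances ("anchors") is nearest, plus
        the class of that nearest anchor (the 1-NN prediction)."""
        rng = np.random.default_rng(seed)
        anchors = rng.choice(len(X), size=m, replace=False)  # small random subset
        nn = NearestNeighbors(n_neighbors=1).fit(X[anchors])

        def transform(Z):
            nearest = nn.kneighbors(Z, return_distance=False).ravel()
            onehot = np.eye(m)[nearest]                # nearest-anchor indicators
            pred = y[anchors][nearest].reshape(-1, 1)  # 1-NN predicted class
            return np.hstack([Z, onehot, pred])

        return transform

    X, y = load_iris(return_X_y=True)

    # Each base SVM gets its own randomly built 1-NN; this randomness is
    # what decorrelates otherwise very stable linear SVMs.
    members = []
    for seed in range(10):
        t = make_dn_transform(X, y, m=10, seed=seed)
        members.append((t, LinearSVC().fit(t(X), y)))

    # Combine by simple majority vote.
    votes = np.array([clf.predict(t(X)) for t, clf in members], dtype=int)
    y_pred = np.apply_along_axis(lambda v: np.bincount(v).argmax(), 0, votes)
    print("training accuracy:", (y_pred == y).mean())

The majority vote at the end is only one way to combine the members; the paper instead plugs \(\mathcal{DN}\)-augmented SVMs into standard ensemble schemes as their base classifiers.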

Copyright information

© 2009 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Maudes, J., Rodríguez, J.J., García-Osorio, C. (2009). Disturbing Neighbors Ensembles for Linear SVM. In: Benediktsson, J.A., Kittler, J., Roli, F. (eds) Multiple Classifier Systems. MCS 2009. Lecture Notes in Computer Science, vol 5519. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-02326-2_20

  • DOI: https://doi.org/10.1007/978-3-642-02326-2_20

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-02325-5

  • Online ISBN: 978-3-642-02326-2

  • eBook Packages: Computer Science, Computer Science (R0)
