Local Decision Bagging of Binary Neural Classifiers

  • Conference paper
Advances in Artificial Intelligence (Canadian AI 2008)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 5032)


Abstract

Bagging, like other classifier ensembles, has yielded performance improvements in many pattern recognition problems over the last decade. A careful analysis of previous work shows, however, that the most significant gains from bagged neural networks are achieved on multiclass problems, whereas binary classification problems seldom benefit from classifier combination. Focusing on binary classification, this paper evaluates the standard bagging approach and explores a novel variant, local bagging, which keeps the standard individual-classifier generation stage but attempts to improve the decision-combination stage by (a) dynamically selecting a subset of the individual classifiers and (b) subsequently weighting them by their local accuracy. Experimental results on standard benchmark data sets, with neural networks, SVMs, Naive Bayes, C4.5 decision trees, and decision stumps as base classifiers, show that local bagging yields significant improvements in these practical applications and is more stable than AdaBoost.

This work has been partially supported by the Spanish MEC project DPI2006-02550.
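
The combination rule the abstract describes, dynamic selection of individual classifiers plus weighting by local accuracy, can be sketched roughly as follows. The k-nearest-neighbour locality estimate, the 0.5 competence threshold, and the decision-stump base learner used here are illustrative assumptions for this sketch, not necessarily the paper's exact choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary problem: two Gaussian blobs in 2-D.
X = np.vstack([rng.normal(-1.0, 1.0, (100, 2)), rng.normal(1.0, 1.0, (100, 2))])
y = np.array([0] * 100 + [1] * 100)
X_train, y_train = X[::2], y[::2]
X_val, y_val = X[1::2], y[1::2]       # held-out set for local-accuracy estimates

def fit_stump(Xb, yb):
    """Fit a one-level decision stump: the (feature, threshold, sign)
    with the best training accuracy."""
    best = (-1.0, 0, 0.0, 1)
    for f in range(Xb.shape[1]):
        for t in np.unique(Xb[:, f]):
            for sign in (1, -1):
                acc = (((sign * (Xb[:, f] - t)) > 0).astype(int) == yb).mean()
                if acc > best[0]:
                    best = (acc, f, t, sign)
    _, f, t, sign = best
    return lambda Z: ((sign * (Z[:, f] - t)) > 0).astype(int)

# Standard bagging: train each base classifier on a bootstrap replicate.
n_models, k = 15, 10
models = []
for _ in range(n_models):
    idx = rng.integers(0, len(X_train), len(X_train))
    models.append(fit_stump(X_train[idx], y_train[idx]))

def local_bagging_predict(x):
    """Weight each model's vote by its accuracy on the k validation points
    nearest to x, discarding models that do no better than chance there."""
    near = np.argsort(np.linalg.norm(X_val - x, axis=1))[:k]
    votes = np.zeros(2)
    for m in models:
        acc = (m(X_val[near]) == y_val[near]).mean()
        if acc > 0.5:                      # (a) dynamic selection
            votes[m(x[None])[0]] += acc    # (b) local-accuracy weighting
    if votes.sum() == 0:                   # fall back to plain majority voting
        for m in models:
            votes[m(x[None])[0]] += 1.0
    return int(votes.argmax())

pred = local_bagging_predict(np.array([2.0, 2.0]))
```

Note that individual-classifier generation is ordinary bootstrap bagging; only the combination stage changes, which is why the scheme can wrap any base learner (the paper's experiments use neural networks, SVMs, Naive Bayes, C4.5, and decision stumps).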




Author information

R. Alaiz-Rodríguez

Editor information

Sabine Bergler


Copyright information

© 2008 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Alaiz-Rodríguez, R. (2008). Local Decision Bagging of Binary Neural Classifiers. In: Bergler, S. (ed.) Advances in Artificial Intelligence. Canadian AI 2008. Lecture Notes in Computer Science (LNAI), vol. 5032. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-68825-9_1

  • DOI: https://doi.org/10.1007/978-3-540-68825-9_1

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-68821-1

  • Online ISBN: 978-3-540-68825-9

  • eBook Packages: Computer Science (R0)
