
Beam Search Extraction and Forgetting Strategies on Shared Ensembles

Conference paper

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 2709)

Abstract

Ensemble methods improve accuracy by combining the predictions of a set of different hypotheses. However, they have an important shortcoming: huge amounts of memory are required to store the set of multiple hypotheses. In this work, we devise an ensemble method that partially solves this problem. The key point is that the components share their common parts. We employ a multi-tree, a structure that can simultaneously contain an ensemble of decision trees while letting those trees share some conditions. To construct the multi-tree, we define an algorithm based on a beam search, with several extraction criteria and several forgetting policies for the suspended nodes. Finally, we compare the behaviour of this ensemble method with some well-known methods for generating hypothesis ensembles.

This work has been partially supported by CICYT under grant TIC2001-2705-C03-01 and Acción Integrada Hispano-Austríaca HA2001-0059.
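The abstract sketches an algorithm rather than giving one, so a small illustration may help. Below is a minimal Python sketch of the core bookkeeping such a method needs: a pool of suspended nodes, a beam-style extraction criterion that resumes the most promising suspended split (growing an alternative tree that shares the conditions above it), and a forgetting policy that bounds the pool's size. All names (SuspendedNode, extract_best, forget) and the particular scoring and forgetting choices are assumptions made for illustration, not the authors' actual algorithm.

```python
import heapq
import itertools

_counter = itertools.count()  # tie-breaker so heapq never compares nodes directly

class SuspendedNode:
    """A candidate split left unexplored by the beam (hypothetical name)."""
    def __init__(self, score, depth, split):
        self.score = score   # heuristic value of the unexplored split
        self.depth = depth   # depth of the node in the shared multi-tree
        self.split = split   # description of the candidate split condition

pool = []  # suspended nodes, kept as a max-heap on score

def suspend(node):
    # Store with negated score so Python's min-heap behaves as a max-heap.
    heapq.heappush(pool, (-node.score, next(_counter), node))

def extract_best():
    # One possible extraction criterion: resume the highest-scoring suspended
    # node; the new subtree shares every condition on the path above it.
    return heapq.heappop(pool)[2] if pool else None

def forget(max_size):
    # One possible forgetting policy: keep only the max_size most promising
    # suspended nodes and discard the rest.
    global pool
    pool = heapq.nsmallest(max_size, pool)  # smallest negated score = best
    heapq.heapify(pool)

# Tiny demo: suspend three candidate splits, forget down to two, extract one.
if __name__ == "__main__":
    for s, d in [(0.61, 2), (0.80, 1), (0.45, 3)]:
        suspend(SuspendedNode(score=s, depth=d, split=f"x{d} <= t"))
    forget(max_size=2)
    best = extract_best()
    print(best.score, best.split)  # -> 0.8 x1 <= t
```

Other extraction criteria (e.g. random choice, or depth-weighted scores) and other forgetting policies (e.g. dropping the oldest suspended nodes) fit the same interface; the paper compares several such variants.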





Copyright information

© 2003 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Estruch, V., Ferri, C., Hernández-Orallo, J., Ramírez-Quintana, M.J. (2003). Beam Search Extraction and Forgetting Strategies on Shared Ensembles. In: Windeatt, T., Roli, F. (eds) Multiple Classifier Systems. MCS 2003. Lecture Notes in Computer Science, vol 2709. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-44938-8_21

  • DOI: https://doi.org/10.1007/3-540-44938-8_21

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-40369-2

  • Online ISBN: 978-3-540-44938-6

  • eBook Packages: Springer Book Archive
