
Multistrategy Ensemble Learning

Reference work entry in the Encyclopedia of Machine Learning and Data Mining

Definition

Different ensemble learning strategies can be expected to affect the base learner in different ways, so combining multiple ensemble learning algorithms can be expected to yield complementary benefits. For example, MultiBoosting combines AdaBoost with a variant of bagging, obtaining most of AdaBoost's bias reduction coupled with most of bagging's variance reduction. Similarly, Random Forests combines bagging's variance reduction with Random Subspaces' bias reduction.
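The combination described above can be approximated with off-the-shelf components: wrap a variance-reducing outer ensemble (bagging) around bias-reducing inner committees (AdaBoost). This is an illustrative sketch of the multistrategy idea, not Webb and Zheng's exact MultiBoost algorithm; the dataset and all parameter values are arbitrary choices for the example.

```python
# Sketch: bagging over AdaBoost committees, combining bagging's
# variance reduction with boosting's bias reduction.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.model_selection import cross_val_score

# A synthetic binary classification task for illustration.
X, y = make_classification(n_samples=500, random_state=0)

# Inner committees: AdaBoost (primarily reduces bias).
# Outer wrapper: bagging over those committees (primarily reduces variance).
multi = BaggingClassifier(
    AdaBoostClassifier(n_estimators=10, random_state=0),
    n_estimators=10,
    random_state=0,
)

score = cross_val_score(multi, X, y, cv=5).mean()
print(round(score, 3))
```

In practice one would compare this combined ensemble against plain bagging and plain AdaBoost of the same total size to see whether the error reductions are in fact complementary on the task at hand.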



Recommended Reading

  • Webb GI, Zheng Z (2004) Multistrategy ensemble learning: reducing error by combining ensemble learning techniques. IEEE Trans Knowl Data Eng 16(8):980–991




Copyright information

© 2017 Springer Science+Business Media New York


Cite this entry

(2017). Multistrategy Ensemble Learning. In: Sammut, C., Webb, G.I. (eds) Encyclopedia of Machine Learning and Data Mining. Springer, Boston, MA. https://doi.org/10.1007/978-1-4899-7687-1_574
