Definition
Each ensemble learning strategy has its own characteristic effect on the base learner, so combining multiple ensemble learning algorithms can be expected to deliver complementary benefits. For example, MultiBoosting combines AdaBoost with a variant of Bagging, obtaining most of AdaBoost's bias reduction coupled with most of Bagging's variance reduction. Similarly, Random Forests combines Bagging's variance reduction with the Random Subspace method's bias reduction.
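The MultiBoosting idea above can be sketched in code: train several AdaBoost committees, each on a resampled version of the training set, and combine them by majority vote. This is a minimal illustrative sketch, not the published algorithm — actual MultiBoosting uses wagging (continuous Poisson instance weights) rather than the plain bootstrap resampling used here, and the decision-stump base learner, committee counts, and function names are all hypothetical choices for the example.

```python
import numpy as np

def train_stump(X, y, w):
    """Weighted decision stump: exhaustive search over (feature, threshold, polarity)."""
    best = (0, 0.0, 1, np.inf)  # (feature, threshold, polarity, weighted error)
    for f in range(X.shape[1]):
        for thr in np.unique(X[:, f]):
            for pol in (1, -1):
                pred = np.where(pol * (X[:, f] - thr) >= 0, 1, -1)
                err = w[pred != y].sum()
                if err < best[3]:
                    best = (f, thr, pol, err)
    return best

def stump_predict(stump, X):
    f, thr, pol, _ = stump
    return np.where(pol * (X[:, f] - thr) >= 0, 1, -1)

def adaboost(X, y, T):
    """Standard AdaBoost on labels in {-1, +1}; returns a weighted committee."""
    w = np.full(len(y), 1.0 / len(y))
    committee = []
    for _ in range(T):
        stump = train_stump(X, y, w)
        err = max(stump[3], 1e-10)          # clip to avoid log(0)
        if err >= 0.5:                      # stop if no better than chance
            break
        alpha = 0.5 * np.log((1 - err) / err)
        w *= np.exp(-alpha * y * stump_predict(stump, X))
        w /= w.sum()
        committee.append((alpha, stump))
    return committee

def committee_predict(committee, X):
    return np.sign(sum(a * stump_predict(s, X) for a, s in committee))

def multiboost(X, y, n_committees=3, T=3, rng=None):
    """Simplified MultiBoosting: Bagging (bootstrap) over AdaBoost committees.
    (The real algorithm uses wagging weights instead of bootstrap samples.)"""
    rng = np.random.default_rng(rng)
    committees = []
    for _ in range(n_committees):
        idx = rng.integers(0, len(y), len(y))   # bootstrap sample (Bagging step)
        committees.append(adaboost(X[idx], y[idx], T))
    return committees

def multiboost_predict(committees, X):
    votes = sum(committee_predict(c, X) for c in committees)
    return np.where(votes >= 0, 1, -1)          # majority vote across committees
```

The outer loop supplies Bagging-style variance reduction (each committee sees a perturbed sample), while the inner AdaBoost loop supplies bias reduction, mirroring the combination the definition describes.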
Recommended Reading
Webb GI, Zheng Z (2004) Multistrategy ensemble learning: reducing error by combining ensemble learning techniques. IEEE Trans Knowl Data Eng 16(8):980–991
© 2017 Springer Science+Business Media New York
Cite this entry
(2017). Multistrategy Ensemble Learning. In: Sammut, C., Webb, G.I. (eds) Encyclopedia of Machine Learning and Data Mining. Springer, Boston, MA. https://doi.org/10.1007/978-1-4899-7687-1_574
Publisher Name: Springer, Boston, MA
Print ISBN: 978-1-4899-7685-7
Online ISBN: 978-1-4899-7687-1