Abstract
We examine three variants of the Boosting algorithm, referred to here as Aggressive Boosting, Conservative Boosting, and Inverse Boosting. During the progressive development of each ensemble we track the diversity measure Q alongside accuracy, in the hope of detecting the point of "paralysis" of the training, if any. Three data sets are used: the artificial Cone-Torus data, the UCI Pima Indian Diabetes data, and the Phoneme data. Each Boosting variant is run with two base-classifier models: the quadratic classifier and a multi-layer perceptron (MLP) neural network. The three variants show different behavior, with Conservative Boosting performing best in most cases.
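The abstract's diversity measure Q is the pairwise Yule Q statistic, computed from the agreement counts of two classifiers' correct/incorrect decisions on a data set. A minimal sketch (function name and the oracle 0/1 correctness vectors are illustrative, not from the paper):

```python
def q_statistic(correct_a, correct_b):
    """Yule's Q diversity between two classifiers.

    Inputs are 0/1 "oracle" vectors: 1 where the classifier labels
    the sample correctly, 0 otherwise. Q ranges from -1 (maximally
    diverse) to +1 (identical correctness patterns).
    """
    pairs = list(zip(correct_a, correct_b))
    n11 = sum(1 for a, b in pairs if a and b)              # both correct
    n00 = sum(1 for a, b in pairs if not a and not b)      # both wrong
    n10 = sum(1 for a, b in pairs if a and not b)          # only A correct
    n01 = sum(1 for a, b in pairs if not a and b)          # only B correct
    num = n11 * n00 - n01 * n10
    den = n11 * n00 + n01 * n10
    return num / den if den else 0.0                       # 0.0 for the degenerate case
```

For an ensemble, Q is typically averaged over all classifier pairs; tracking that average against accuracy as members are added is the kind of monitoring the abstract describes.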
Copyright information
© 2002 Springer-Verlag Berlin Heidelberg
Cite this paper
Kuncheva, L.I., Whitaker, C.J. (2002). Using Diversity with Three Variants of Boosting: Aggressive, Conservative, and Inverse. In: Roli, F., Kittler, J. (eds) Multiple Classifier Systems. MCS 2002. Lecture Notes in Computer Science, vol 2364. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-45428-4_8
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-43818-2
Online ISBN: 978-3-540-45428-1