Abstract
We propose totally-corrective multi-class boosting algorithms in this work. First, we discuss methods that extend two-class boosting to the multi-class case by studying two existing boosting algorithms, AdaBoost.MO and SAMME, and formulate convex optimization problems that minimize their regularized cost functions. We then propose a column-generation based totally-corrective framework for multi-class boosting by examining the Lagrange dual problems. Experimental results on UCI datasets show that the new algorithms have comparable generalization capability but converge much faster than their counterparts. Experiments on MNIST handwritten digit classification also demonstrate the effectiveness of the proposed algorithms.
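The column-generation scheme the abstract describes can be illustrated with a minimal sketch. This is not the paper's multi-class algorithm; it is a simplified binary-class, LPBoost-style loop with decision stumps, where the exponential loss stands in for the paper's regularized cost and the totally-corrective step re-optimizes all weak-learner coefficients by plain gradient descent rather than a convex solver. All function names are illustrative.

```python
import numpy as np

def stump_predict(X, feat, thresh, polarity):
    # Decision stump: +/-1 depending on which side of the threshold x falls.
    return polarity * np.sign(X[:, feat] - thresh + 1e-12)

def best_stump(X, y, w):
    # Weak-learner "oracle": return the stump maximizing the edge
    # sum_i w_i * y_i * h(x_i) under the current dual variables w.
    best, best_edge = None, -np.inf
    for feat in range(X.shape[1]):
        for thresh in np.unique(X[:, feat]):
            for pol in (1.0, -1.0):
                edge = np.sum(w * y * stump_predict(X, feat, thresh, pol))
                if edge > best_edge:
                    best, best_edge = (feat, thresh, pol), edge
    return best, best_edge

def totally_corrective_boost(X, y, n_rounds=10, lr=0.5, inner=200):
    w = np.ones(len(y)) / len(y)      # dual variables (sample weights)
    stumps, alphas = [], []
    for _ in range(n_rounds):
        # Column generation: add the column (weak learner) with the best edge.
        stump, _ = best_stump(X, y, w)
        stumps.append(stump)
        alphas = list(alphas) + [0.0]
        H = np.column_stack([stump_predict(X, *s) for s in stumps])
        a = np.array(alphas)
        # Totally-corrective step: re-optimize ALL coefficients, not just the
        # newest one, here by projected gradient descent on the exp-loss.
        for _ in range(inner):
            margins = y * (H @ a)
            losses = np.exp(-margins)
            grad = -(H * (y * losses)[:, None]).sum(axis=0) / len(y)
            a = np.maximum(a - lr * grad, 0.0)   # keep coefficients >= 0
        alphas = a
        # Update the dual variables from the new margins.
        w = np.exp(-y * (H @ a))
        w /= w.sum()
    return stumps, alphas

def predict(X, stumps, alphas):
    H = np.column_stack([stump_predict(X, *s) for s in stumps])
    return np.sign(H @ alphas)
```

Because every round re-solves for all coefficients, the ensemble typically needs far fewer weak learners than stage-wise boosting, which is the convergence advantage the abstract refers to.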
© 2011 Springer-Verlag Berlin Heidelberg
Hao, Z., Shen, C., Barnes, N., Wang, B. (2011). Totally-Corrective Multi-class Boosting. In: Kimmel, R., Klette, R., Sugimoto, A. (eds) Computer Vision – ACCV 2010. ACCV 2010. Lecture Notes in Computer Science, vol 6495. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-19282-1_22
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-19281-4
Online ISBN: 978-3-642-19282-1