Abstract
Many theoretical and experimental studies have shown that ensemble learning achieves better classification accuracy and stability than individual classifiers. In this paper, we propose a novel two-stage weighted classifier ensemble method based on classifier pruning. In the first stage, we use canonical correlation analysis (CCA) to model the maximum correlation between training data points and base classifiers. Based on these global multi-linear projections, a sparse regression method prunes the base classifiers, so that each test data point dynamically selects a subset of classifiers to form its own ensemble; from this global view, the effects of noisy inputs and inaccurate classifiers are reduced. In the second stage, the pruned classifiers are weighted locally by a fusion method that exploits their generalization ability on the nearest neighbors of each test data point. In this way, each test data point builds a unique locally weighted classifier ensemble. Experimental results on several UCI data sets show that our method outperforms other ensemble methods such as Random Forests, Majority Voting, AdaBoost, and DREP.
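The second stage described above (per-point pruning followed by locally weighted voting) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the CCA and sparse-regression pruning of stage one is replaced here by a simple local-accuracy threshold `tau`, and the function, parameter names, and distance choice (Euclidean) are assumptions for the sketch.

```python
import math
from collections import defaultdict

def locally_weighted_ensemble(x, train_X, train_y, classifiers, k=3, tau=0.5):
    """Predict a label for test point x with a per-point pruned, locally
    weighted ensemble. Each element of `classifiers` is a callable mapping
    a point to a label. Classifiers whose accuracy on the k nearest
    training neighbors falls below `tau` are pruned for this point."""
    # Indices of the k nearest training neighbors of x (Euclidean distance).
    nn = sorted(range(len(train_X)),
                key=lambda i: math.dist(x, train_X[i]))[:k]
    votes = defaultdict(float)
    for clf in classifiers:
        # Local accuracy of this classifier on the neighborhood of x.
        acc = sum(clf(train_X[i]) == train_y[i] for i in nn) / k
        if acc >= tau:                 # prune locally weak classifiers
            votes[clf(x)] += acc       # accuracy-weighted vote
    if not votes:                      # everything pruned: plain majority vote
        for clf in classifiers:
            votes[clf(x)] += 1.0
    return max(votes, key=votes.get)

# Toy usage: one accurate and one noisy base classifier.
train_X = [(0, 0), (0, 1), (1, 0), (1, 1)]
train_y = [0, 0, 1, 1]
clf_a = lambda p: 0 if p[0] < 0.5 else 1   # splits on the first coordinate
clf_b = lambda p: 1                        # always predicts class 1 (noisy)
print(locally_weighted_ensemble((0.1, 0.2), train_X, train_y, [clf_a, clf_b]))
# → 0 (the noisy classifier is pruned in this neighborhood)
```

Because pruning and weighting both depend on the neighborhood of `x`, two different test points can end up with different classifier subsets and different weights, which is the key property of the proposed method.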
Acknowledgments
This work was funded in part by the National Natural Science Foundation of China (No. 61572240, 61601202), Natural Science Foundation of Jiangsu Province (Grant No. BK20140571), the Open Project Program of the National Laboratory of Pattern Recognition (NLPR) (No. 201600005), and the Research Innovation Program for College Graduates of Jiangsu Province (Grant No. SJLX16_0440).
Copyright information
© 2018 Springer Nature Singapore Pte Ltd.
Cite this paper
Cai, C., Wornyo, D.K., Wang, L., Shen, X. (2018). Building Weighted Classifier Ensembles Through Classifiers Pruning. In: Huet, B., Nie, L., Hong, R. (eds) Internet Multimedia Computing and Service. ICIMCS 2017. Communications in Computer and Information Science, vol 819. Springer, Singapore. https://doi.org/10.1007/978-981-10-8530-7_13
Print ISBN: 978-981-10-8529-1
Online ISBN: 978-981-10-8530-7