Abstract
Combining multiple classifiers has long been considered an effective approach to improving classification performance. A popular diversification approach, known as local specialization, is based on simultaneously partitioning the feature space and assigning a compound classifier to each of the resulting subspaces. This paper presents a novel feature space partitioning algorithm for multiple classifier combination. The proposed method uses a pairwise measure to assess the diversity between classifiers and selects complementary classifiers to generate pseudo labels. Based on these pseudo labels, it splits the feature space into constituent regions and selects the best classifier committee from the pool of available classifiers. Partitioning and selection take place simultaneously as part of a compound optimization process aimed at maximizing system performance. Evolutionary methods are used to find the optimal solution. Experimental results demonstrate the effectiveness and efficiency of the proposed method.
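The abstract does not specify which pairwise diversity measure is used, but two standard choices from the ensemble literature are the disagreement measure and the Q-statistic. As a minimal illustrative sketch (not the authors' implementation), pairwise diversity between two classifiers can be computed from their joint correctness counts on a labeled set:

```python
import numpy as np

def pairwise_diversity(pred_a, pred_b, y_true):
    """Disagreement measure and Q-statistic between two classifiers,
    computed from their joint correctness on y_true (illustrative sketch)."""
    a = np.asarray(pred_a) == np.asarray(y_true)  # where classifier A is correct
    b = np.asarray(pred_b) == np.asarray(y_true)  # where classifier B is correct
    n11 = np.sum(a & b)    # both correct
    n00 = np.sum(~a & ~b)  # both wrong
    n10 = np.sum(a & ~b)   # only A correct
    n01 = np.sum(~a & b)   # only B correct
    n = len(y_true)
    # Disagreement: fraction of samples on which exactly one classifier is correct.
    disagreement = (n10 + n01) / n
    # Q-statistic: in [-1, 1]; negative values indicate complementary errors.
    denom = n11 * n00 + n10 * n01
    q_stat = (n11 * n00 - n10 * n01) / denom if denom else 0.0
    return disagreement, q_stat
```

Under such a measure, a pair with a strongly negative Q-statistic (complementary errors) would be a natural candidate for the "complementary classifiers" the method selects to generate pseudo labels.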
Copyright information
© 2016 Springer Nature Singapore Pte Ltd.
About this paper
Cite this paper
Yingju, X., Cuiqin, H., Jun, S. (2016). Combination of Multiple Classifier Using Feature Space Partitioning. In: Tan, T., Li, X., Chen, X., Zhou, J., Yang, J., Cheng, H. (eds) Pattern Recognition. CCPR 2016. Communications in Computer and Information Science, vol 662. Springer, Singapore. https://doi.org/10.1007/978-981-10-3002-4_50
Print ISBN: 978-981-10-3001-7
Online ISBN: 978-981-10-3002-4