
Combination of Multiple Classifier Using Feature Space Partitioning

  • Conference paper
Pattern Recognition (CCPR 2016)

Part of the book series: Communications in Computer and Information Science (CCIS, volume 662)


Abstract

Combining multiple classifiers has long been considered an effective approach to improving classification performance. A popular diversification approach, local specialization, is based on simultaneously partitioning the feature space and assigning a compound classifier to each of the resulting subspaces. This paper presents a novel feature space partitioning algorithm for combining multiple classifiers. The proposed method uses a pairwise measure to quantify the diversity between classifiers and selects complementary classifiers to generate pseudo labels. Based on the pseudo labels, it splits the feature space into constituents and selects the best classifier committee from the pool of available classifiers. The partitioning and selection take place simultaneously as part of a compound optimization process aimed at maximizing system performance, with evolutionary methods used to find the optimal solution. Experimental results show the effectiveness and efficiency of the proposed method.
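
Only the abstract is available in this preview, so the method's details are not reproduced here. As a rough illustration of one ingredient the abstract names, the sketch below computes a standard pairwise diversity statistic (the disagreement measure of Kuncheva and Whitaker) over a pool of classifiers' validation-set predictions and picks the most complementary pair. The function names, the exhaustive pair search, and the toy data are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def disagreement(preds_a, preds_b):
    # Pairwise disagreement: fraction of samples on which the two
    # classifiers predict different labels (Kuncheva & Whitaker's
    # disagreement measure; higher means more diverse).
    return float(np.mean(preds_a != preds_b))

def most_complementary_pair(pool_predictions):
    # Exhaustively score every pair in the pool and return the pair
    # with the highest disagreement (assumed selection rule; the
    # paper's actual criterion is not given in the abstract).
    best_pair, best_score = None, -1.0
    for i in range(len(pool_predictions)):
        for j in range(i + 1, len(pool_predictions)):
            score = disagreement(pool_predictions[i], pool_predictions[j])
            if score > best_score:
                best_pair, best_score = (i, j), score
    return best_pair, best_score

# Toy usage: three classifiers' predictions on ten validation samples.
pool = [
    np.array([0, 1, 1, 0, 1, 0, 0, 1, 1, 0]),
    np.array([0, 1, 0, 0, 1, 1, 0, 1, 0, 0]),
    np.array([1, 0, 1, 1, 0, 0, 1, 0, 1, 1]),
]
pair, score = most_complementary_pair(pool)
print(f"most complementary pair: {pair}, disagreement = {score:.2f}")
```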

Author information

Correspondence to Xia Yingju.


Copyright information

© 2016 Springer Nature Singapore Pte Ltd.

About this paper

Cite this paper

Yingju, X., Cuiqin, H., Jun, S. (2016). Combination of Multiple Classifier Using Feature Space Partitioning. In: Tan, T., Li, X., Chen, X., Zhou, J., Yang, J., Cheng, H. (eds) Pattern Recognition. CCPR 2016. Communications in Computer and Information Science, vol 662. Springer, Singapore. https://doi.org/10.1007/978-981-10-3002-4_50

  • DOI: https://doi.org/10.1007/978-981-10-3002-4_50

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-10-3001-7

  • Online ISBN: 978-981-10-3002-4

  • eBook Packages: Computer Science, Computer Science (R0)
