
A multiple classifiers time-serial ensemble pruning algorithm based on the mechanism of forward supplement

Published in: Applied Intelligence

Abstract

Because it has many typical applications and pressing needs, efficient classification learning over accumulated big data in nonstationary environments has recently become a hot topic in data mining. The LearnNSE algorithm is an important result in this field. For long-term accumulated big data, a pruned version, LearnNSE-Pruned-Age, was proposed and has received widespread attention. However, its pruning mechanism is imperfect: it loses the core ability of LearnNSE to reuse previously learned classification knowledge. This paper therefore adjusts the ensemble mechanism of LearnNSE and designs a novel one. The new mechanism uses an ensemble of the latest base classifiers to track changes in the data-generating environment, and then selects those old base classifiers that still contribute to the current classification task for forward supplementary integration. On this basis, a new pruned algorithm, FLearnNSE-Pruned-Age, is proposed. Experimental results show that FLearnNSE-Pruned-Age retains the ability to reuse learned classification knowledge and achieves classification accuracy very close to that of LearnNSE, and even better in some scenarios. In addition, it improves the efficiency of ensemble learning and is suitable for fast classification learning over accumulated big data.
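The forward-supplement idea described above can be sketched in a few lines: the newest base classifiers are kept unconditionally to track the current environment, while older classifiers are re-admitted only if they still perform well on the latest data chunk, so their learned knowledge is reused rather than discarded by age alone. The stump learner, the accuracy threshold `eps`, and all function names below are illustrative assumptions, not the authors' implementation.

```python
def train_stump(chunk):
    """Train a 1-D threshold stump on (x, y) pairs with labels in {0, 1}."""
    best_t, best_acc = chunk[0][0], 0.0
    for t in sorted(x for x, _ in chunk):
        acc = sum((x >= t) == bool(y) for x, y in chunk) / len(chunk)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return lambda x, t=best_t: int(x >= t)

def accuracy(clf, chunk):
    """Fraction of the chunk the classifier labels correctly."""
    return sum(clf(x) == y for x, y in chunk) / len(chunk)

def prune_forward_supplement(pool, chunk, n_latest=2, eps=0.6):
    """Keep the n_latest newest classifiers unconditionally, then
    forward-supplement any older classifier whose accuracy on the
    current chunk reaches the threshold eps (assumed criterion)."""
    latest, older = pool[-n_latest:], pool[:-n_latest]
    kept = list(latest)
    for clf in older:
        if accuracy(clf, chunk) >= eps:
            kept.append(clf)  # still useful: reuse its learned knowledge
    return kept

def predict(ensemble, x):
    """Unweighted majority vote over the pruned ensemble."""
    votes = sum(clf(x) for clf in ensemble)
    return int(votes * 2 >= len(ensemble))
```

In this sketch, old classifiers that agree with the current concept survive pruning regardless of age, while classifiers trained on obsolete concepts are dropped; this mirrors the paper's contrast with purely age-based pruning.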




Acknowledgements

This work was supported by the National Natural Science Foundation of China (No. 61702229), the Natural Science Foundation of Jiangsu Province of China (No. BK20150531), the Industry-School-Research Institution Cooperation Project of Jiangsu Province (No. BY2021075), the Industry-University Cooperation Collaborative Education Project of the Ministry of Education of China (No. 201902128024), the Key Higher Education Reform Research Project of Jiangsu University (No. 2021JGZD022), and the National Statistical Science Research Project of China (No. 2016LY17).

Author information


Corresponding author

Correspondence to Yan Shen.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Shen, Y., Jing, L., Gao, T. et al. A multiple classifiers time-serial ensemble pruning algorithm based on the mechanism of forward supplement. Appl Intell 53, 5620–5634 (2023). https://doi.org/10.1007/s10489-022-03855-z

