Stochastic ensemble pruning method via simulated quenching walking

Original Article · International Journal of Machine Learning and Cybernetics

Abstract

Inspired by an upward stochastic walking idea, this paper develops a new ensemble pruning method called simulated quenching walking (SQWALKING). The rationale behind the method is to value stochastic movements and to accept even less promising solutions while investigating the search space. SQWALKING incorporates simulated quenching and forward selection to choose models from the ensemble through probabilistic steps. Two versions are introduced, based on two different evaluation measures: SQWALKINGA, which uses an accuracy measure, and SQWALKINGH, which uses a human-like foresight measure. The main objective is to construct a proper ensemble pruning architecture that is independent of the ensemble construction and combination phases. Extensive comparisons between the proposed method and its competitors on both heterogeneous and homogeneous ensembles are performed using ten datasets. On the heterogeneous ensemble, SQWALKINGH and SQWALKINGA yield average accuracy improvements of 5.13% and 4.22%, respectively. One reason for these promising results is that the pruning phase spends additional time, compared to rival methods, searching for the best models. Finally, the proposed SQWALKINGs are also evaluated on a real-world dataset.
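The abstract describes combining simulated quenching (an aggressively cooled variant of simulated annealing) with forward selection, accepting some non-improving model subsets probabilistically. The paper's exact algorithm, evaluation measures, and cooling schedule are not reproduced on this page; the following is a minimal generic sketch of that idea, in which the `evaluate` callable and the toy accuracy numbers are hypothetical stand-ins for the paper's accuracy or foresight measures:

```python
import math
import random

def simulated_quenching_prune(models, evaluate, steps=200, t0=1.0, quench=0.9, seed=0):
    """Stochastic ensemble pruning sketch: single-model add/remove moves
    with annealing-style acceptance and a fast (quenching) cooling schedule.

    models:   list of candidate model identifiers
    evaluate: callable mapping a subset (tuple of identifiers) to a score,
              higher is better (e.g. validation accuracy of the sub-ensemble)
    """
    rng = random.Random(seed)
    current = tuple()
    best, best_score = current, evaluate(current)
    t = t0
    for _ in range(steps):
        # Propose a stochastic move: toggle one model in or out of the subset.
        candidate = set(current)
        candidate.symmetric_difference_update([rng.choice(models)])
        candidate = tuple(sorted(candidate))
        delta = evaluate(candidate) - evaluate(current)
        # Accept improvements outright; accept worse subsets with
        # probability exp(delta / t), as in simulated annealing.
        if delta >= 0 or rng.random() < math.exp(delta / max(t, 1e-12)):
            current = candidate
            score = evaluate(current)
            if score > best_score:
                best, best_score = current, score
        # Quenching: cool faster than a classic annealing schedule would.
        t *= quench
    return best, best_score

# Toy usage: score a subset by its mean accuracy (hypothetical numbers).
acc = {"a": 0.9, "b": 0.8, "c": 0.6}
def mean_accuracy(subset):
    return sum(acc[m] for m in subset) / len(subset) if subset else 0.0

best_subset, best_score = simulated_quenching_prune(list(acc), mean_accuracy)
```

As the temperature decays, the walk behaves less like random exploration and more like greedy forward selection, which mirrors the abstract's point that extra search time in the pruning phase is what buys the accuracy gains.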



Acknowledgements

The authors would like to thank the editor and all reviewers for their valuable comments and suggestions, which improved the quality of the paper.

Author information

Correspondence to Amir Hossein Niknamfar.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


Cite this article

Taghavi, Z.S., Niaki, S.T.A. & Niknamfar, A.H. Stochastic ensemble pruning method via simulated quenching walking. Int. J. Mach. Learn. & Cyber. 10, 1875–1892 (2019). https://doi.org/10.1007/s13042-018-00912-3
