Abstract
Inspired by the idea of upward stochastic walking, this paper develops a new ensemble pruning method called simulated quenching walking (SQWALKING). The rationale behind the method is to give value to stochastic moves and to accept non-improving solutions while exploring the search space. SQWALKING incorporates simulated quenching and forward selection to choose models from the ensemble through probabilistic steps. Two versions of SQWALKING are introduced, based on two different evaluation measures: SQWALKINGA, which uses an accuracy measure, and SQWALKINGH, which uses a human-like foresight measure. The main objective is to construct an ensemble pruning architecture that is independent of the ensemble construction and combination phases. Extensive comparisons between the proposed method and its competitors are performed on both heterogeneous and homogeneous ensembles using ten datasets. The comparisons on the heterogeneous ensemble show that SQWALKINGH and SQWALKINGA yield average accuracy improvements of 5.13% and 4.22%, respectively. One reason for these promising results is that the pruning phase spends additional time, compared to rival methods, searching for the best models. Finally, the proposed SQWALKINGs are also evaluated on a real-world dataset.
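To make the core procedure concrete, the following is a minimal, hypothetical sketch (not the authors' implementation) of a simulated-quenching-style forward selection for ensemble pruning: candidate models are added one at a time in stochastic walking steps, non-improving additions are accepted with a Boltzmann probability, and an aggressive geometric cooling (quenching) schedule shrinks that probability quickly. The function name `prune_ensemble`, the cooling parameters, and the `evaluate` callback are illustrative assumptions; the two SQWALKING variants correspond to different choices of the evaluation measure (accuracy for SQWALKINGA, a human-like foresight measure for SQWALKINGH).

```python
# Illustrative sketch only: assumed cooling schedule and acceptance rule,
# not the authors' exact SQWALKING procedure.
import math
import random


def prune_ensemble(models, evaluate, t_start=1.0, t_end=1e-3, quench=0.5, seed=0):
    """Grow a sub-ensemble by forward selection with probabilistic acceptance.

    models   : list of candidate models (any identifiers work)
    evaluate : callable mapping a list of models to a validation score
               (higher is better), e.g. accuracy on a held-out pruning set
    """
    rng = random.Random(seed)
    selected = []                      # models chosen so far
    current_score = float("-inf")      # score of the current sub-ensemble
    best_subset, best_score = [], float("-inf")
    temperature = t_start
    remaining = list(models)

    while remaining and temperature > t_end:
        candidate = rng.choice(remaining)          # stochastic walking step
        score = evaluate(selected + [candidate])
        delta = score - current_score
        # Always accept improvements; accept worse additions with a
        # Boltzmann probability that shrinks as the temperature drops.
        if delta >= 0 or rng.random() < math.exp(delta / temperature):
            selected.append(candidate)
            remaining.remove(candidate)
            current_score = score
            if score > best_score:
                best_subset, best_score = list(selected), score
        temperature *= quench                      # aggressive (quenching) cooling
    return best_subset, best_score
```

For instance, `evaluate` might return the majority-vote accuracy of the candidate sub-ensemble on a separate pruning set; swapping in a different measure inside this callback is the only change needed to move between the two variants sketched here.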





Acknowledgements
The authors thank the editor and the reviewers for their valuable comments and suggestions, which helped improve the quality of the paper.
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Cite this article
Taghavi, Z.S., Niaki, S.T.A. & Niknamfar, A.H. Stochastic ensemble pruning method via simulated quenching walking. Int. J. Mach. Learn. & Cyber. 10, 1875–1892 (2019). https://doi.org/10.1007/s13042-018-00912-3