Abstract
This paper presents an effective and efficient Hierarchical and Parallel Branch-and-Bound Ensemble Selection (H&PB&BEnS) algorithm, which performs ensemble selection in a divisional, parallel, and hierarchical manner. H&PB&BEnS exploits the strong performance of the Branch-and-Bound (B&B) algorithm on small-scale combinatorial optimization problems, while avoiding the "curse of dimensionality" that arises when B&B is applied directly to the full ensemble selection problem. The B&B algorithm selects each partitioned subensemble, which enhances the predictive accuracy of each pruned subsolution, while the overall working mechanism of H&PB&BEnS promotes the diversity of the selected ensembles. H&PB&BEnS also refines the selected solutions layer by layer, so that the classification performance of the selected ensembles improves with each layer. Empirical investigations on five benchmark classification datasets verify the effectiveness and efficiency of the proposed algorithm.
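The divide-then-search scheme described in the abstract can be sketched in code. The following is an illustrative toy reconstruction, not the authors' implementation: the functions `majority_vote_accuracy`, `bnb_select`, and `hierarchical_select`, the use of an "oracle accuracy" pruning bound, and the majority-vote objective are all assumptions of this sketch. It shows the two key ideas: a depth-first B&B search that is exact on a small partition, and a layer-wise loop that pools the per-partition winners (those independent searches are what would run in parallel) and searches again.

```python
import numpy as np

def majority_vote_accuracy(correct, subset):
    """Validation accuracy of a strict majority vote, given a boolean
    matrix correct[i, j] = classifier i is right on sample j."""
    votes = correct[list(subset)].sum(axis=0)
    return float((votes * 2 > len(subset)).mean())

def bnb_select(correct):
    """Exact branch-and-bound search over all subsets of a small pool,
    maximizing majority-vote accuracy. Feasible only because each
    partition is small -- exactly the point of partitioning first."""
    n = correct.shape[0]
    best_acc, best_sub = 0.0, ()

    def dfs(idx, chosen):
        nonlocal best_acc, best_sub
        if idx == n:
            return
        # Optimistic bound: no extension of `chosen` can beat the
        # "oracle" accuracy (at least one reachable classifier correct),
        # since a sample every remaining classifier misses stays wrong.
        reachable = chosen + list(range(idx, n))
        if float(correct[reachable].any(axis=0).mean()) <= best_acc:
            return                       # prune this whole branch
        new = chosen + [idx]             # branch 1: include classifier idx
        acc = majority_vote_accuracy(correct, new)
        if acc > best_acc:
            best_acc, best_sub = acc, tuple(new)
        dfs(idx + 1, new)
        dfs(idx + 1, chosen)             # branch 2: exclude classifier idx

    dfs(0, [])
    return best_acc, best_sub

def hierarchical_select(correct, partition_size=4):
    """Layer-wise refinement: split the pool into partitions, run B&B on
    each (independent searches, hence parallelizable), pool the
    winners, and repeat until one final B&B over the survivors."""
    indices = list(range(correct.shape[0]))
    while len(indices) > partition_size:
        survivors = []
        for start in range(0, len(indices), partition_size):
            part = indices[start:start + partition_size]
            _, sub = bnb_select(correct[part])
            survivors.extend(part[i] for i in sub)
        if survivors == indices:         # no classifier was pruned this layer
            break
        indices = survivors
    _, sub = bnb_select(correct[indices])
    return [indices[i] for i in sub]
```

Under this sketch, each layer only ever hands B&B a pool of at most `partition_size` classifiers, so the worst-case `2^n` subset enumeration never sees the full ensemble, while diversity is preserved because winners from different partitions are recombined in the next layer.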
Acknowledgments
This work is supported by the National Natural Science Foundation of China under Grant no. 61473150.
Ethics declarations
Conflict of interest
The authors declare that they have no conflict of interest.
Research involving Human Participants and/or Animals
All procedures performed in studies involving human participants were carried out in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards.
For this type of study, formal consent was not required.
All applicable international, national, and/or institutional guidelines for the care and use of animals were followed.
Informed consent
Informed consent was obtained from all individual participants involved in the study.
Additional informed consent was obtained from all individual participants for whom identifying information is included in this article.
Cite this article
Dai, Q., Yao, C. A hierarchical and parallel branch-and-bound ensemble selection algorithm. Appl Intell 46, 45–61 (2017). https://doi.org/10.1007/s10489-016-0817-8