
An exploratory study of mono and multi-objective metaheuristics to ensemble of classifiers

Published in: Applied Intelligence

Abstract

This paper presents an exploratory study of metaheuristic optimization techniques for selecting important parameters (features and members) in the design of ensembles of classifiers. To this end, we perform an empirical investigation applying 10 different optimization techniques to 23 classification problems. Furthermore, we analyze the performance of both mono- and multi-objective versions of these techniques, using all combinations of three objectives: classification error and two diversity measures that are important to ensembles, good diversity and bad diversity. Additionally, the optimization techniques must also select members for heterogeneous ensembles, using k-NN, Decision Tree and Naive Bayes as individual classifiers, all combined by majority vote. The main aim of this study is to determine which optimization techniques obtain the best results in the mono- and multi-objective contexts, and to compare them with classical ensemble techniques such as bagging, boosting and random forest. Our findings indicate that three optimization techniques, Memetic, SA and PSO, provide better performance than the other optimization techniques as well as the traditional ensemble generators (bagging, boosting and random forest).
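The good/bad diversity objectives mentioned above come from the decomposition of the majority-vote error: the ensemble error equals the average member error, minus disagreement on examples the ensemble gets right (good diversity), plus agreement "wasted" on examples it gets wrong (bad diversity). A minimal sketch of this decomposition follows; the 0/1 correctness matrix is synthetic, standing in for the outputs of members such as k-NN, Decision Tree and Naive Bayes, and is not data from the paper.

```python
# Sketch of the good/bad diversity decomposition of majority-vote error.
# correct[i][n] = 1 if classifier i labels example n correctly (synthetic data).
correct = [
    [1, 1, 0, 1, 0, 1],  # e.g. k-NN
    [1, 0, 1, 1, 0, 0],  # e.g. Decision Tree
    [0, 1, 1, 0, 0, 1],  # e.g. Naive Bayes
]
M = len(correct)      # ensemble size
N = len(correct[0])   # number of examples

# Majority vote is correct on example n iff more than half the members are.
mv_correct = [sum(row[n] for row in correct) > M / 2 for n in range(N)]

# Average zero-one error of the individual members.
avg_err = sum(M - sum(row[n] for row in correct) for n in range(N)) / (M * N)
# Zero-one error of the majority vote.
mv_err = sum(1 for ok in mv_correct if not ok) / N

# Good diversity: wrong votes on examples the ensemble still gets right.
good = sum((M - sum(row[n] for row in correct)) / M
           for n in range(N) if mv_correct[n]) / N
# Bad diversity: correct votes on examples the ensemble gets wrong.
bad = sum(sum(row[n] for row in correct) / M
          for n in range(N) if not mv_correct[n]) / N

# Decomposition: majority-vote error = average member error - good + bad.
assert abs(mv_err - (avg_err - good + bad)) < 1e-12
print(mv_err, avg_err, good, bad)
```

In a mono-objective setting one of these quantities (or the error) is optimized alone; the multi-objective versions in the study treat combinations of them as competing objectives.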



Acknowledgments

This work was financially supported by CNPq (Brazilian Research Council).

Author information

Correspondence to Anne M. P. Canuto.

About this article

Cite this article

Feitosa Neto, A.A., Canuto, A.M.P. An exploratory study of mono and multi-objective metaheuristics to ensemble of classifiers. Appl Intell 48, 416–431 (2018). https://doi.org/10.1007/s10489-017-0982-4
