On Selection of Optimal Classifiers

  • Conference paper
Artificial Intelligence XXXVI (SGAI 2019)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 11927)

Abstract

Current advances in computational power and storage allow more models to be created and stored from significant data resources. This progress opens the opportunity to recycle and reuse such models in similar exercises. However, evaluating machine learning algorithms and selecting an appropriate classifier from an existing collection remain challenging tasks. In most cases, the selection decision is left to the user, and when it is not made carefully, the resulting performance can be unexpectedly poor. Classification algorithms aim to optimise distinct objectives such as minimising misclassification error, maximising accuracy, or maximising model quality, and the right choice among these objectives is critical to the quality of the selected classifier. This work studies a multi-objective method for finding a set of suitable classifiers for a problem at hand. We applied seven classifiers to mental health data sets and selected among them in terms of correctness and reliability. The experimental results suggest that this approach is useful in finding the best trade-off among the objectives when selecting a suitable classifier.
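The abstract frames classifier selection as a trade-off among several objectives. A minimal sketch of that idea is Pareto-dominance filtering over per-classifier scores: a classifier is kept only if no other candidate is at least as good on every objective and strictly better on one. The snippet below is illustrative only; the candidate classifiers, the two objectives (accuracy and Cohen's kappa as a reliability proxy), and the synthetic data standing in for the mental-health survey are assumptions, not the paper's exact setup.

```python
# Sketch: multi-objective classifier selection via Pareto dominance.
# Assumptions (not from the paper): candidate set, objectives, synthetic data.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_validate
from sklearn.metrics import make_scorer, cohen_kappa_score
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier
from sklearn.linear_model import LogisticRegression

# Synthetic stand-in for the mental-health survey data used in the paper.
X, y = make_classification(n_samples=500, n_features=20, random_state=0)

candidates = {
    "decision_tree": DecisionTreeClassifier(random_state=0),
    "random_forest": RandomForestClassifier(random_state=0),
    "naive_bayes": GaussianNB(),
    "knn": KNeighborsClassifier(),
    "logistic_regression": LogisticRegression(max_iter=1000),
}

# Two objectives to maximise: accuracy (correctness) and kappa (reliability proxy).
scoring = {"accuracy": "accuracy", "kappa": make_scorer(cohen_kappa_score)}

scores = {}
for name, clf in candidates.items():
    cv = cross_validate(clf, X, y, cv=5, scoring=scoring)
    scores[name] = (cv["test_accuracy"].mean(), cv["test_kappa"].mean())

def dominates(a, b):
    """a dominates b if it is no worse on every objective and better on at least one."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

# Pareto set: classifiers not dominated by any other candidate.
pareto = [n for n, s in scores.items()
          if not any(dominates(t, s) for m, t in scores.items() if m != n)]

for name in pareto:
    acc, kappa = scores[name]
    print(f"{name}: accuracy={acc:.3f}, kappa={kappa:.3f}")
```

The non-dominated set returned at the end is the "best trade-off" collection the abstract refers to: the final choice among those survivors is left to the user's preference between the objectives.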



Author information

Corresponding author

Correspondence to Omesaad Rado.


Copyright information

© 2019 Springer Nature Switzerland AG

About this paper


Cite this paper

Rado, O., Neagu, D. (2019). On Selection of Optimal Classifiers. In: Bramer, M., Petridis, M. (eds) Artificial Intelligence XXXVI. SGAI 2019. Lecture Notes in Computer Science (LNAI), vol 11927. Springer, Cham. https://doi.org/10.1007/978-3-030-34885-4_42


  • DOI: https://doi.org/10.1007/978-3-030-34885-4_42

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-34884-7

  • Online ISBN: 978-3-030-34885-4

  • eBook Packages: Computer Science, Computer Science (R0)
