Abstract
Parameter settings have a great impact on the overall behavior of a machine learning model, affecting training time, infrastructure resource requirements, convergence, and accuracy. When training machine learning models, it is difficult to choose optimal values for the various parameters that define the final model architecture. Machine learning models have two kinds of parameters: model parameters, which are estimated by fitting the model to the given data, and model hyperparameters, which control the learning process itself. Model parameters are determined automatically by the learning algorithm; for example, the weights of a neural network are updated at each iteration until optimal values are reached. Hyperparameters, by contrast, must be set by the designer, such as the value of k in a kNN model, and the process of finding the optimal hyperparameter values is referred to as hyperparameter tuning. Tuning aims to determine the combination of hyperparameter values under which the model performs best, and setting this combination well is essential to maximizing model performance. Current approaches include random search over the solution space and exhaustive grid search, among others. This article presents a comparative analysis of these methods against a genetic-algorithm-based approach to hyperparameter tuning.
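For context, the following is a minimal sketch of the two baseline strategies the abstract mentions, using scikit-learn's GridSearchCV and RandomizedSearchCV to tune k for a kNN classifier. The dataset, search range, and cross-validation settings are illustrative assumptions, not the paper's experimental setup.

```python
# Illustrative sketch (not the paper's setup): grid search vs. random search
# for tuning k in a kNN classifier with scikit-learn.
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)              # assumed toy dataset
param_grid = {"n_neighbors": list(range(1, 31))}  # assumed search range for k

# Grid search: exhaustively evaluates every candidate value of k.
grid = GridSearchCV(KNeighborsClassifier(), param_grid, cv=5)
grid.fit(X, y)
print("grid search best k:", grid.best_params_, grid.best_score_)

# Random search: samples a fixed budget of candidates from the same space.
rand = RandomizedSearchCV(KNeighborsClassifier(), param_grid,
                          n_iter=10, cv=5, random_state=0)
rand.fit(X, y)
print("random search best k:", rand.best_params_, rand.best_score_)
```

A correspondingly minimal genetic-algorithm loop over the same search space is sketched below; the population size, selection, crossover, and mutation choices are assumptions for illustration, not the authors' configuration.

```python
# Illustrative genetic-algorithm sketch for the same tuning task; all GA
# hyperparameters (population size, mutation rate, generations) are assumed.
import random
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)

def fitness(k):
    # Fitness = mean cross-validated accuracy of kNN with the candidate k.
    return cross_val_score(KNeighborsClassifier(n_neighbors=k), X, y, cv=5).mean()

random.seed(0)
population = [random.randint(1, 30) for _ in range(8)]  # initial candidates for k
for generation in range(10):
    ranked = sorted(population, key=fitness, reverse=True)
    parents = ranked[:4]                     # selection: keep the fittest half
    children = []
    while len(children) < 4:
        a, b = random.sample(parents, 2)
        child = (a + b) // 2                 # crossover: average the parents
        if random.random() < 0.3:            # mutation: small random perturbation
            child = max(1, min(30, child + random.randint(-3, 3)))
        children.append(child)
    population = parents + children
best_k = max(population, key=fitness)
print("GA best k:", best_k, "accuracy:", fitness(best_k))
```

Unlike grid search, which enumerates the space, or random search, which samples it blindly, the GA concentrates its evaluation budget around promising candidates by evolving them; this trade-off is what the article's comparative analysis examines.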
Data Availability
Not applicable.
Funding
No funding was received for this research.
Ethics declarations
Conflict of Interest
The authors declare that they have no conflict of interest.
Ethical Approval
This article does not contain any studies with human participants or animals performed by any of the authors.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
This article is part of the topical collection “Advances in Computational Intelligence for Artificial Intelligence, Machine Learning, Internet of Things and Data Analytics” guest edited by S. Meenakshi Sundaram, Young Lee and Gururaj K S.
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Shanthi, D.L., Chethan, N. Genetic Algorithm Based Hyper-Parameter Tuning to Improve the Performance of Machine Learning Models. SN COMPUT. SCI. 4, 119 (2023). https://doi.org/10.1007/s42979-022-01537-8