A Multi-objective Hyperparameter Optimization for Machine Learning Using Genetic Algorithms: A Green AI Centric Approach

  • Conference paper
  • In: Advances in Artificial Intelligence – IBERAMIA 2022 (IBERAMIA 2022)

Abstract

This work proposes, develops, and evaluates an approach to improve the efficiency of ML models. The approach is centered on Green AI, where a model's efficiency is a trade-off between accuracy, time to solution, and energy consumption. This leads to a multi-objective optimization problem, which we solve with a Genetic Algorithm (GA). We present the GA scheme and operators designed for this work, focused on the architecture and hyperparameter optimization of an ML pipeline and developed to be part of an AutoML solution. The GA was evaluated with the XGBoost algorithm, and the results show its effectiveness for this multi-objective optimization. Moreover, it was possible to reduce energy consumption with minimal loss of predictive performance.
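
The sketch below is only an illustration of the kind of search the abstract describes: a GA over XGBoost hyperparameters whose fitness trades predictive accuracy against computational cost, here collapsed into a weighted sum with training time standing in for measured energy consumption (the paper measures energy directly and uses its own GA scheme and operators). The search space, operators, and weights are hypothetical and are not taken from the paper.

```python
# Illustrative sketch only: a simple GA searching XGBoost hyperparameters,
# rewarding accuracy and penalizing time to solution as a proxy for energy.
# Search space, operators, and weights are hypothetical, not the authors'.
import random
import time

from sklearn.datasets import make_classification
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from xgboost import XGBClassifier

# Hypothetical discrete search space for the GA.
SPACE = {
    "n_estimators": [50, 100, 200, 400],
    "max_depth": [3, 5, 7, 9],
    "learning_rate": [0.01, 0.05, 0.1, 0.3],
    "subsample": [0.6, 0.8, 1.0],
}

# Synthetic data so the sketch is self-contained.
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

def random_individual():
    return {k: random.choice(v) for k, v in SPACE.items()}

def fitness(ind, w_acc=0.7, w_cost=0.3):
    """Weighted-sum scalarization: reward accuracy, penalize training time."""
    start = time.perf_counter()
    model = XGBClassifier(**ind, eval_metric="logloss")
    model.fit(X_tr, y_tr)
    elapsed = time.perf_counter() - start
    acc = accuracy_score(y_te, model.predict(X_te))
    return w_acc * acc - w_cost * min(elapsed / 10.0, 1.0)  # cap the time penalty

def crossover(a, b):
    # Uniform crossover: each gene comes from either parent.
    return {k: random.choice([a[k], b[k]]) for k in SPACE}

def mutate(ind, rate=0.2):
    return {k: (random.choice(SPACE[k]) if random.random() < rate else v)
            for k, v in ind.items()}

def evolve(pop_size=8, generations=5):
    # Re-evaluates individuals each generation for simplicity (wasteful but clear).
    pop = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        parents = scored[: pop_size // 2]  # truncation selection
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=fitness)

if __name__ == "__main__":
    print("Best hyperparameters found:", evolve())
```

A true multi-objective treatment would keep accuracy, time, and energy as separate objectives and return a Pareto front rather than a single scalarized optimum; the weighted sum above is only the simplest stand-in.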

Notes

  1. Gartner, https://www.gartner.com/smarterwithgartner/top-trends-on-the-gartner-hype-cycle-for-artificial-intelligence-2019.

  2. https://cran.r-project.org/web/packages/rminer/rminer.pdf.

  3. https://archive.ics.uci.edu/ml/datasets/HIGGS.

  4. https://www.kaggle.com/saurabhshahane/seoul-bike-trip-duration-prediction.

Acknowledgements

This work is funded by FAPERJ, CAPES, and LNCC-MCTI, through the project GreenAI 21-CLIMAT-07 and the SUSAIN Inria Associated Team.

Author information

Corresponding author

Correspondence to Mariza Ferro.

Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Yokoyama, A.M., Ferro, M., Schulze, B. (2022). A Multi-objective Hyperparameter Optimization for Machine Learning Using Genetic Algorithms: A Green AI Centric Approach. In: Bicharra Garcia, A.C., Ferro, M., Rodríguez Ribón, J.C. (eds) Advances in Artificial Intelligence – IBERAMIA 2022. IBERAMIA 2022. Lecture Notes in Computer Science, vol. 13788. Springer, Cham. https://doi.org/10.1007/978-3-031-22419-5_12

  • DOI: https://doi.org/10.1007/978-3-031-22419-5_12

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-22418-8

  • Online ISBN: 978-3-031-22419-5

  • eBook Packages: Computer Science, Computer Science (R0)
