A lightweight knowledge-based PSO for SVM hyper-parameters tuning in a dynamic environment

Published in: The Journal of Supercomputing

Abstract

Hyper-parameter optimization is a crucial task in designing kernel-based machine learning models, and the hyper-parameter values can be set using various optimization algorithms. Because the objective function is data-dependent, however, the optimal hyper-parameter configuration changes over time in a dynamic environment, i.e., an environment where training data keep being added continuously. Finding the optimal hyper-parameter values in such an environment requires running the optimization algorithm repeatedly, and the data dependency of the objective function increases the average time complexity of the optimization process. This paper proposes a novel knowledge-based approach that uses particle swarm optimization (PSO) as the base algorithm to optimize the hyper-parameters of a support vector machine (SVM). The framework comprises two major modules: a knowledge transfer module and a drift detection module. The knowledge transfer module generates knowledge by running PSO and transfers it to subsequent time instances, while the drift detection module detects changes in the objective function when new data are added to the existing data. Drift detection allows the transferred knowledge to be exploited at a particular time instance, reducing the execution time of the overall optimization process. Evaluated on standard datasets (Adult, DNA, Nist-Digits, Segment, Splice, Mushroom and Usps) over five consecutive time instances, the proposed framework achieved an average execution time of 30.72 s, compared with 36.89 s for general PSO. It also optimizes the hyper-parameters considerably faster than other existing approaches such as grid search, chained-PSO, dynamic model selection and quantized dynamic multi-PSO.
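The two modules described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names (`pso_minimize`, `tune_over_time`, `make_objective`), the drift test (re-scoring the previous optimum on the new objective and comparing against a tolerance), and the knowledge-transfer mechanism (seeding the new swarm with the previous optimum) are simplified assumptions, and a toy quadratic whose minimum shifts over time stands in for the real SVM cross-validation objective over (log C, log gamma).

```python
import random

def pso_minimize(objective, bounds, seeds=None, n_particles=10, iters=40, seed=0):
    """Minimal PSO minimizer. `seeds` injects known-good points from a
    previous time instance into an otherwise random swarm (a simplified
    form of the paper's knowledge-transfer idea)."""
    rng = random.Random(seed)
    dim = len(bounds)
    swarm = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    for k, s in enumerate(seeds or []):
        swarm[k % n_particles] = list(s)          # transferred knowledge
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in swarm]
    pbest_f = [objective(p) for p in swarm]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    w, c1, c2 = 0.7, 1.5, 1.5                     # inertia and acceleration weights
    for _ in range(iters):
        for i, p in enumerate(swarm):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - p[d])
                             + c2 * r2 * (gbest[d] - p[d]))
                p[d] = min(max(p[d] + vel[i][d], bounds[d][0]), bounds[d][1])
            f = objective(p)
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = p[:], f
                if f < gbest_f:
                    gbest, gbest_f = p[:], f
    return gbest, gbest_f

def tune_over_time(objectives, bounds, drift_tol=0.05):
    """Tune across consecutive time instances. If the previous optimum still
    scores within `drift_tol` of its old value on the new objective (no drift
    detected), skip re-optimization entirely; otherwise re-run PSO seeded
    with the previous optimum."""
    best, best_f, history = None, None, []
    for obj in objectives:
        if best is not None and abs(obj(best) - best_f) <= drift_tol:
            history.append((best, best_f, "reused"))
            continue
        best, best_f = pso_minimize(obj, bounds, seeds=[best] if best else None)
        history.append((best, best_f, "optimized"))
    return history

# Stand-in for the real objective (SVM cross-validation error over
# log2(C), log2(gamma)); its minimum shifts as new training data arrive.
def make_objective(shift):
    return lambda p: (p[0] - 1.0 - shift) ** 2 + (p[1] + 2.0) ** 2

bounds = [(-5.0, 5.0), (-5.0, 5.0)]               # search range per hyper-parameter
history = tune_over_time([make_objective(s) for s in (0.0, 0.0, 1.5)], bounds)
```

In this toy run, the second time instance adds no drift, so the drift check lets the framework reuse the first instance's optimum without re-running PSO; the third instance shifts the optimum, triggering a re-optimization that starts from the transferred knowledge. This skip-or-warm-start pattern is the mechanism by which the paper's framework reduces average execution time relative to re-running general PSO at every instance.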


Data availability

Not applicable.


Acknowledgements

Not applicable.

Funding

Not applicable.

Author information


Contributions

DJK was involved in conceptualization, methodology, software, investigation and formal analysis. VPS was involved in conceptualization, methodology, validation and supervision. VK was involved in resources, supervision and validation.

Corresponding author

Correspondence to Vibhav Prakash Singh.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Ethical approval

This article does not contain any studies with human participants or animals by any authors.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Kalita, D.J., Singh, V.P. & Kumar, V. A lightweight knowledge-based PSO for SVM hyper-parameters tuning in a dynamic environment. J Supercomput 79, 18777–18799 (2023). https://doi.org/10.1007/s11227-023-05385-y

