Parameters Optimization of Support Vector Machine Based on the Optimal Foraging Theory

Chapter in Machine Learning Paradigms: Theory and Application

Abstract

Support Vector Machine (SVM) is one of the most popular supervised machine learning algorithms and can be used for both regression and classification problems. The SVM algorithm works by finding the optimal hyperplane that discriminates between different classes; for nonlinearly separable data, this hyperplane is constructed in a feature space induced by a kernel function. In SVM, the penalty parameter C and the σ parameter of the Radial Basis Function (RBF) kernel have a significant impact on the complexity and performance of the model. These parameters are usually chosen arbitrarily, yet determining their optimal values is essential for obtaining the expected learning performance. In this chapter, an optimization method based on optimal foraging theory is proposed to adjust the two main parameters of the Gaussian (RBF) kernel of SVM in order to increase classification accuracy. Six well-known benchmark datasets taken from the UCI machine learning repository were employed to evaluate the proposed optimal foraging algorithm for SVM parameter optimization (OFA-SVM). In addition, OFA-SVM is compared with five other well-known, recent meta-heuristic optimization algorithms: Bat Algorithm (BA), Genetic Algorithm (GA), Artificial Bee Colony (ABC), Chicken Swarm Optimization (CSO) and Particle Swarm Optimization (PSO). The experimental results show that the proposed OFA-SVM achieves better results than the other algorithms and demonstrate its capability of finding the optimal values of the RBF parameters of SVM.
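To make the tuning problem concrete, the sketch below illustrates in Python (scikit-learn) how cross-validated classification accuracy can serve as the fitness function when searching over (C, σ) for an RBF-kernel SVM. It is an illustration only, not the chapter's OFA implementation: random search stands in for the optimal foraging algorithm, and the dataset and search ranges are assumptions.

```python
# Minimal sketch (not the chapter's OFA implementation): tune the RBF-SVM
# hyperparameters C and sigma by maximizing cross-validated accuracy.
# Random search stands in for the optimal foraging algorithm; the dataset
# and the log-scale search ranges are assumptions for illustration.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

rng = np.random.default_rng(0)
best_score, best_params = -np.inf, None

for _ in range(50):                   # candidate (C, sigma) pairs
    C = 10 ** rng.uniform(-2, 3)      # assumed range for the penalty parameter C
    sigma = 10 ** rng.uniform(-2, 2)  # assumed range for the RBF width sigma
    gamma = 1.0 / (2.0 * sigma ** 2)  # RBF kernel: K(x, z) = exp(-||x - z||^2 / (2 sigma^2))
    score = cross_val_score(SVC(C=C, kernel="rbf", gamma=gamma), X, y, cv=5).mean()
    if score > best_score:
        best_score, best_params = score, (C, sigma)

print(f"best CV accuracy = {best_score:.3f}, (C, sigma) = {best_params}")
```

In OFA-SVM, the same fitness evaluation would be driven by the optimal foraging search rules rather than by random sampling.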

G. I. Sayed, M. Soliman, A. E. Hassanien—Scientific Research Group in Egypt (SRGE), http://www.egyptscience.net.

Author information

Corresponding author: Gehad Ismail Sayed.

Copyright information

© 2019 Springer Nature Switzerland AG

About this chapter

Cite this chapter

Sayed, G.I., Soliman, M., Hassanien, A.E. (2019). Parameters Optimization of Support Vector Machine Based on the Optimal Foraging Theory. In: Hassanien, A. (eds) Machine Learning Paradigms: Theory and Application. Studies in Computational Intelligence, vol 801. Springer, Cham. https://doi.org/10.1007/978-3-030-02357-7_15
