Joint Feature Selection and Classifier Parameter Optimization: A Bio-Inspired Approach

  • Conference paper
  • In: Knowledge Science, Engineering and Management (KSEM 2023)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 14117)

Abstract

Feature selection is an effective method for handling large amounts of data, and together with the parameter settings of the classifier it determines classification performance. However, many studies treat the two separately, ignoring the intrinsic connection between them. In this work, we therefore formulate a joint feature selection and parameter optimization problem, which is NP-hard and has a mixed-variable structure, and we propose an improved binary honey badger algorithm (IBHBA) to solve it. First, a novel initialization strategy based on the fast correlation-based filter (FCBF) method generates promising initial solutions. Second, IBHBA introduces a local search factor based on simulated annealing (SA), a crossover operator based on tournament selection, and a mutation mechanism to improve the performance of the conventional HBA. Finally, a binary mechanism is adopted to make the algorithm suitable for the feature selection problem. Experiments on 27 public datasets demonstrate that the proposed approach outperforms several well-known swarm-based algorithms.
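
The problem described above is mixed-variable: a binary mask selects features while continuous values set the classifier's hyper-parameters, and both parts are evaluated together through a wrapper. The sketch below illustrates one common encoding and fitness function of this kind; the SVM classifier, the sigmoid (S-shaped) binarization, the weight alpha, and all names and ranges are illustrative assumptions rather than details taken from the paper.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# Hypothetical joint encoding: the first n_features entries are real-valued
# scores that an S-shaped transfer function turns into a binary feature mask;
# the last two entries encode the SVM hyper-parameters C and gamma.
def decode(position, n_features, c_range=(1e-2, 1e2), g_range=(1e-4, 1e1)):
    scores, params = position[:n_features], position[n_features:]
    mask = 1.0 / (1.0 + np.exp(-scores)) > 0.5          # S-shaped binarization
    C = c_range[0] + (c_range[1] - c_range[0]) * params[0]
    gamma = g_range[0] + (g_range[1] - g_range[0]) * params[1]
    return mask, C, gamma

# Wrapper fitness: weighted sum of cross-validated error and subset size
# (lower is better); a swarm optimizer such as IBHBA would minimize this.
def fitness(position, X, y, alpha=0.99):
    mask, C, gamma = decode(position, X.shape[1])
    if not mask.any():                                   # guard against empty subsets
        return 1.0
    acc = cross_val_score(SVC(C=C, gamma=gamma), X[:, mask], y, cv=5).mean()
    return alpha * (1.0 - acc) + (1.0 - alpha) * mask.sum() / X.shape[1]

# Toy usage with random data, for illustration only.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 20))
y = rng.integers(0, 2, size=100)
candidate = np.concatenate([rng.normal(size=20), rng.random(2)])
print(fitness(candidate, X, y))
```

Because the mask is binary while the hyper-parameters are continuous, a candidate solution cannot be searched with a purely discrete or purely continuous operator, which is what makes the formulated problem mixed-variable and motivates the binary mechanism mentioned in the abstract.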

Acknowledgements

This work is supported in part by the National Key Research and Development Program of China (2022YFB4500600), in part by the National Natural Science Foundation of China (61872158, 62002133, 62172186, 62272194), in part by the Science and Technology Development Plan Project of Jilin Province (20200201166JC, 20190701019GH, 20190701002GH), in part by Graduate Innovation Fund of Jilin University (2022028, 2022155, 2023CX013), and in part by the Excellent Young Talents Program for Department of Science and Technology of Jilin Province (Grant 20190103051JH).

Author information

Corresponding authors

Correspondence to Geng Sun or Jiahui Li.

Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Wei, Z., et al. (2023). Joint Feature Selection and Classifier Parameter Optimization: A Bio-Inspired Approach. In: Jin, Z., Jiang, Y., Buchmann, R.A., Bi, Y., Ghiran, A.M., Ma, W. (eds) Knowledge Science, Engineering and Management. KSEM 2023. Lecture Notes in Computer Science, vol 14117. Springer, Cham. https://doi.org/10.1007/978-3-031-40283-8_1

  • DOI: https://doi.org/10.1007/978-3-031-40283-8_1

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-40282-1

  • Online ISBN: 978-3-031-40283-8

  • eBook Packages: Computer Science, Computer Science (R0)
