Research article · SETN Conference Proceedings · DOI: 10.1145/3411408.3411438

Sonar Inspired Optimization based Feature Selection

Published: 02 September 2020

ABSTRACT

One of the problems that Machine Learning (ML) algorithms face in classification tasks is the Curse of Dimensionality, which refers to the sensitivity of their performance to the dimensionality of the data. A common remedy is to select the features that are most important to the resulting model. To improve classifier performance, various meta-heuristic algorithms have been applied, owing to their ability to find near-optimal solutions in problems with many candidate solutions, such as Feature Selection (FS). In this study, the Sonar Inspired Optimization (SIO) algorithm is used to perform FS in order to improve the performance of a well-established classifier, k-NN. SIO's performance is compared with that of other nature-inspired meta-heuristic algorithms that have been applied to the same task.
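The setup the abstract describes is a wrapper approach: a meta-heuristic searches over binary feature masks, and the accuracy of a k-NN classifier trained on the selected features serves as the fitness function. The sketch below illustrates that wrapper loop only; it is not the SIO algorithm itself (a plain random search stands in for the meta-heuristic), and the data, parameters, and helper names are illustrative assumptions, not taken from the paper.

```python
# Sketch of wrapper feature selection with a k-NN fitness function.
# NOTE: a random search over binary masks stands in for the SIO
# meta-heuristic; all names and parameters here are illustrative.
import numpy as np

rng = np.random.default_rng(0)

def knn_accuracy(X_train, y_train, X_test, y_test, k=3):
    """Accuracy of a plain Euclidean k-NN classifier."""
    correct = 0
    for x, y in zip(X_test, y_test):
        dists = np.linalg.norm(X_train - x, axis=1)
        nearest = y_train[np.argsort(dists)[:k]]
        pred = np.bincount(nearest).argmax()  # majority vote
        correct += int(pred == y)
    return correct / len(y_test)

def fitness(mask, X_tr, y_tr, X_te, y_te):
    """Wrapper fitness: k-NN accuracy on the selected feature subset."""
    if not mask.any():          # empty subset: worst possible fitness
        return 0.0
    return knn_accuracy(X_tr[:, mask], y_tr, X_te[:, mask], y_te)

# Synthetic data: 2 informative features out of 10.
X = rng.normal(size=(200, 10))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
X_tr, X_te, y_tr, y_te = X[:150], X[150:], y[:150], y[150:]

# Meta-heuristic placeholder: sample random binary masks, keep the best.
best_mask, best_fit = None, -1.0
for _ in range(200):
    mask = rng.random(10) < 0.5
    f = fitness(mask, X_tr, y_tr, X_te, y_te)
    if f > best_fit:
        best_mask, best_fit = mask, f

print("selected features:", np.flatnonzero(best_mask))
print("k-NN accuracy on subset:", round(best_fit, 3))
```

In the paper's setting, the random-mask sampling step would be replaced by SIO's sonar-inspired update rules, while the fitness evaluation stays the same.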


Published in

SETN 2020: 11th Hellenic Conference on Artificial Intelligence
September 2020, 249 pages
Copyright © 2020 ACM

Publisher: Association for Computing Machinery, New York, NY, United States
Published: 2 September 2020
