ABSTRACT
One of the problems that Machine Learning (ML) algorithms face in classification tasks is the Curse of Dimensionality: their performance degrades as the dimensionality of the data grows. A common remedy is to select only the features that matter most to the model being built. To improve the performance of a classifier, various meta-heuristic algorithms have been applied to this task, owing to their ability to find good solutions in problems with many candidate solutions, such as Feature Selection (FS). In this study, the Sonar Inspired Optimization (SIO) algorithm is used to perform FS in order to improve the performance of a well-established classifier, namely k-NN. SIO’s performance is compared with that of other nature-inspired meta-heuristic algorithms that have been used for the same task.
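The wrapper approach described above can be sketched in a few lines. Note that this is a generic illustration under stated assumptions, not the SIO algorithm itself: a binary feature mask is scored by the accuracy of a k-NN classifier on held-out data, and a simple stand-in search (single-bit hill climbing) plays the role of the metaheuristic. All function names here are illustrative.

```python
import math
import random

def knn_predict(train_X, train_y, x, mask, k=3):
    """Classify x with k-NN, using only the features selected by the binary mask."""
    dists = []
    for xi, yi in zip(train_X, train_y):
        d = math.sqrt(sum((a - b) ** 2 for a, b, m in zip(xi, x, mask) if m))
        dists.append((d, yi))
    dists.sort(key=lambda t: t[0])
    votes = [yi for _, yi in dists[:k]]
    return max(set(votes), key=votes.count)  # majority vote

def fitness(train_X, train_y, test_X, test_y, mask, k=3):
    """Wrapper fitness: held-out accuracy of k-NN restricted to the masked features."""
    if not any(mask):
        return 0.0  # an empty feature subset is invalid
    correct = sum(knn_predict(train_X, train_y, x, mask, k) == y
                  for x, y in zip(test_X, test_y))
    return correct / len(test_y)

def wrapper_fs(train_X, train_y, test_X, test_y, n_iters=100, k=3, seed=42):
    """Stand-in metaheuristic: start from a random mask and flip one random bit
    per iteration, keeping the change only if accuracy does not drop."""
    rng = random.Random(seed)
    n_feat = len(train_X[0])
    mask = [rng.randint(0, 1) for _ in range(n_feat)]
    best = fitness(train_X, train_y, test_X, test_y, mask, k)
    for _ in range(n_iters):
        cand = mask[:]
        cand[rng.randrange(n_feat)] ^= 1  # flip one bit of the mask
        score = fitness(train_X, train_y, test_X, test_y, cand, k)
        if score >= best and any(cand):
            mask, best = cand, score
    return mask, best
```

In the study itself, SIO's sonar-based search would replace the bit-flip step, and benchmark datasets (e.g. from the UCI repository) would replace any toy data.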