Abstract
Feature selection algorithms explore the data to eliminate noisy, irrelevant, and redundant features while simultaneously optimizing classification performance. In this paper, a classification-accuracy-based fitness function is proposed for the gray wolf optimizer to find an optimal feature subset. The gray wolf optimizer is a recent evolutionary computation technique that mimics the leadership hierarchy and hunting mechanism of gray wolves in nature. Its aim is to find optimal regions of a complex search space through the interaction of individuals in the population. Compared with particle swarm optimization (PSO) and genetic algorithms (GA) on a set of datasets from the UCI machine learning repository, the proposed approach achieves better performance in both classification accuracy and feature-size reduction. Moreover, the gray wolf optimization approach is considerably more robust to initialization than the PSO and GA optimizers.
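The scheme outlined above can be sketched as a binary gray wolf optimizer: each wolf is a 0/1 feature mask, the three fittest wolves (alpha, beta, delta) pull the pack toward themselves, and a classification-accuracy-based fitness scores each mask. The sketch below is a minimal illustration under stated assumptions, not the paper's exact implementation: the sigmoid transfer used to binarize positions, the position clipping, and the toy fitness function (a hypothetical accuracy proxy standing in for a real classifier's cross-validated accuracy) are all choices of this example.

```python
import numpy as np

def gwo_feature_select(fitness, n_features, n_wolves=8, n_iters=30, seed=0):
    """Binary gray-wolf-optimizer sketch for feature selection.

    Wolves hold continuous positions in [0, 1]^n_features that are mapped to
    0/1 feature masks each iteration. The three fittest wolves (alpha, beta,
    delta) guide the rest of the pack, mimicking the leadership hierarchy
    and encircling behaviour of the hunt.
    """
    rng = np.random.default_rng(seed)
    pos = rng.random((n_wolves, n_features))
    best_mask, best_score = None, -np.inf
    for t in range(n_iters):
        # Sigmoid transfer function: one common way to binarize continuous
        # positions (an assumption of this sketch, not stated in the paper).
        masks = (1.0 / (1.0 + np.exp(-10 * (pos - 0.5)))
                 > rng.random(pos.shape)).astype(int)
        scores = np.array([fitness(m) for m in masks])
        order = np.argsort(scores)[::-1]           # maximize fitness
        if scores[order[0]] > best_score:          # track best mask seen
            best_score = float(scores[order[0]])
            best_mask = masks[order[0]].copy()
        a = 2.0 * (1 - t / n_iters)                # a decays linearly 2 -> 0
        new_pos = np.zeros_like(pos)
        for leader in pos[order[:3]]:              # alpha, beta, delta
            r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
            A, C = 2 * a * r1 - a, 2 * r2
            # Each leader pulls the whole pack toward itself (hunting step).
            new_pos += (leader - A * np.abs(C * leader - pos)) / 3.0
        pos = np.clip(new_pos, 0.0, 1.0)           # keep positions in [0, 1]
    return best_mask, best_score

# Toy fitness: rewards selecting the (hypothetical) relevant features 0 and 3,
# with a small penalty per selected feature; a real run would instead use a
# classifier's accuracy on held-out data, as the paper proposes.
relevant = np.array([1, 0, 0, 1, 0, 0])

def fitness(mask):
    if mask.sum() == 0:
        return 0.0
    hit = (mask * relevant).sum() / relevant.sum()
    return hit - 0.05 * mask.mean()

mask, score = gwo_feature_select(fitness, n_features=6)
```

The division by 3.0 inside the leader loop implements the canonical GWO update, in which the next position is the average of the three positions proposed by alpha, beta, and delta.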
Copyright information
© 2015 Springer International Publishing Switzerland
Cite this paper
Emary, E., Zawbaa, H.M., Grosan, C., Hassenian, A.E. (2015). Feature Subset Selection Approach by Gray-Wolf Optimization. In: Abraham, A., Krömer, P., Snasel, V. (eds) Afro-European Conference for Industrial Advancement. Advances in Intelligent Systems and Computing, vol 334. Springer, Cham. https://doi.org/10.1007/978-3-319-13572-4_1
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-13571-7
Online ISBN: 978-3-319-13572-4
eBook Packages: Engineering