Abstract
In the current era of data explosion, feature selection has received sustained attention as a means of removing large amounts of irrelevant data and improving classification accuracy. In this paper, a feature selection method based on a reinforcement-behaved strategy is proposed, which identifies the most important features by embedding the Brain Storm Optimization (BSO) algorithm into the classifier. The ideas of BSO are mapped to feature subsets, and the importance of each feature is evaluated through indicators such as the validity of its migration. During each migration, a feature is replaced by a new feature at the same position between two generations. The feedback from each such action is used as the basis for ranking feature importance, and an updating strategy is presented that modifies the actions according to the current state to improve the feature set. The effectiveness of the proposed algorithm is demonstrated on six binary classification datasets (covering, e.g., biometrics and geography) in comparison with several embedded methods. The results show that the proposed method achieves higher performance and stability at lower computational cost.
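To make the wrapper-style idea behind the abstract concrete, the following is a minimal sketch (not the authors' exact algorithm) of population-based feature selection in the spirit of BSO: individuals are binary masks over features, fitness is cross-validated classifier accuracy, and new "ideas" are produced by perturbing existing masks. Names such as `bso_feature_selection`, `n_ideas`, and `p_flip` are illustrative assumptions, not taken from the paper.

```python
# Hedged sketch of BSO-style wrapper feature selection (illustrative only).
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.datasets import load_breast_cancer


def fitness(mask, X, y, clf):
    """Mean 3-fold CV accuracy on the selected feature subset; empty masks score 0."""
    if not mask.any():
        return 0.0
    return cross_val_score(clf, X[:, mask], y, cv=3).mean()


def bso_feature_selection(X, y, n_ideas=20, n_iter=30, p_flip=0.1, seed=0):
    rng = np.random.default_rng(seed)
    n_features = X.shape[1]
    clf = KNeighborsClassifier()

    # Initialise a population of random binary feature masks ("ideas").
    ideas = rng.random((n_ideas, n_features)) < 0.5
    scores = np.array([fitness(m, X, y, clf) for m in ideas])

    for _ in range(n_iter):
        for i in range(n_ideas):
            # New idea: perturb either the current best mask or a random one
            # (a rough analogue of BSO's idea generation, not the paper's rule).
            base = ideas[scores.argmax()] if rng.random() < 0.5 else ideas[rng.integers(n_ideas)]
            new = base.copy()
            flips = rng.random(n_features) < p_flip
            new[flips] = ~new[flips]

            # Greedy replacement: keep the new mask only if it scores better.
            new_score = fitness(new, X, y, clf)
            if new_score > scores[i]:
                ideas[i], scores[i] = new, new_score

    best = ideas[scores.argmax()]
    return best, scores.max()


if __name__ == "__main__":
    X, y = load_breast_cancer(return_X_y=True)
    mask, acc = bso_feature_selection(X, y)
    print(f"selected {mask.sum()} of {mask.size} features, CV accuracy {acc:.3f}")
```

In this sketch the classifier acts only as a fitness evaluator; the paper's reinforcement-behaved feedback and feature-importance ranking are simplified to greedy replacement of weaker masks.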
Acknowledgment
This work is partially supported by The National Natural Science Foundation of China (Grants Nos. 71571120, 71271140, 71471158, 71001072, 61472257), Natural Science Foundation of Guangdong Province (2016A030310074, 2018A030310575), Innovation and Entrepreneurship Research Center of Guangdong University Student (2018A073825), Shenzhen Science and Technology Plan (CXZZ20140418182638764), Research Foundation of Shenzhen University (85303/00000155), Research Cultivation Project from Shenzhen Institute of Information Technology (ZY201717), and Innovating and Upgrading Institute Project from Department of Education of Guangdong Province (2017GWTSCX038).
Copyright information
© 2019 Springer Nature Switzerland AG
About this paper
Cite this paper
Niu, B., Yang, X., Wang, H. (2019). Feature Selection Using a Reinforcement-Behaved Brain Storm Optimization. In: Huang, DS., Huang, ZK., Hussain, A. (eds) Intelligent Computing Methodologies. ICIC 2019. Lecture Notes in Computer Science, vol 11645. Springer, Cham. https://doi.org/10.1007/978-3-030-26766-7_61
DOI: https://doi.org/10.1007/978-3-030-26766-7_61
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-26765-0
Online ISBN: 978-3-030-26766-7
eBook Packages: Computer Science, Computer Science (R0)