
Feature Selection Using a Reinforcement-Behaved Brain Storm Optimization

  • Conference paper in Intelligent Computing Methodologies (ICIC 2019)
  • Part of the book series: Lecture Notes in Computer Science (LNAI, volume 11645)

Abstract

In this era of data explosion, feature selection has received sustained attention as a way to remove large amounts of meaningless data and improve classification accuracy. This paper proposes a feature selection method based on a reinforcement-behaved strategy, which identifies the most important features by embedding the Brain Storm Optimization (BSO) algorithm into the classifier. Ideas in BSO are mapped to feature subsets, and the importance of each feature is evaluated through indicators such as the validity of feature migration. In each migration, a feature is replaced by the feature at the same position in the next generation, and the feedback from each such action serves as the basis for ranking feature importance. An updating strategy then modifies the actions according to the current state to improve the feature set. The effectiveness of the proposed algorithm is demonstrated on six binary classification datasets (e.g., biometrics and geography) in comparison with several embedded methods. The results show that the proposed method offers high performance, stability, and low computational cost.
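The abstract's core idea, BSO ideas encoded as feature subsets and accepted or rejected by classifier feedback, can be illustrated with a minimal sketch. This is not the authors' implementation: the nearest-centroid fitness, the synthetic data, and all parameter values (`pop`, `iters`, `p_flip`) are illustrative assumptions standing in for the real classifier and datasets.

```python
# Hypothetical sketch of BSO-style wrapper feature selection.
# Each "idea" is a binary mask over features; fitness is classifier
# accuracy minus a small penalty on subset size.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic binary-classification data: features 0-2 informative, 3-9 noise.
n, d = 200, 10
X = rng.normal(size=(n, d))
y = (X[:, 0] + X[:, 1] - X[:, 2] > 0).astype(int)

def fitness(mask):
    """Nearest-centroid training accuracy on the selected features,
    penalised by subset size (a stand-in for a real embedded classifier)."""
    if not mask.any():
        return 0.0
    Xs = X[:, mask]
    c0 = Xs[y == 0].mean(axis=0)
    c1 = Xs[y == 1].mean(axis=0)
    pred = (np.linalg.norm(Xs - c1, axis=1) <
            np.linalg.norm(Xs - c0, axis=1)).astype(int)
    return (pred == y).mean() - 0.01 * mask.sum()

def bso_feature_select(pop=20, iters=60, p_flip=0.1):
    # Population of candidate feature subsets ("ideas").
    ideas = rng.random((pop, d)) < 0.5
    fits = np.array([fitness(m) for m in ideas])
    for _ in range(iters):
        i = rng.integers(pop)
        # New idea: flip a few bits of either the current best idea
        # or a randomly chosen one (a simplified BSO generation step).
        base = ideas[fits.argmax()] if rng.random() < 0.7 else ideas[i]
        new = base ^ (rng.random(d) < p_flip)
        f = fitness(new)
        if f > fits[i]:  # greedy replacement acts as the "feedback" signal
            ideas[i], fits[i] = new, f
    return ideas[fits.argmax()]

best = bso_feature_select()
print("selected features:", np.flatnonzero(best))
```

In the paper's terms, the acceptance test plays the role of the reinforcement feedback: a feature migration that improves classification is kept, and repeated feedback over generations yields the feature-importance ranking.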



Acknowledgment

This work is partially supported by the National Natural Science Foundation of China (Grant Nos. 71571120, 71271140, 71471158, 71001072, 61472257), the Natural Science Foundation of Guangdong Province (2016A030310074, 2018A030310575), the Innovation and Entrepreneurship Research Center of Guangdong University Student (2018A073825), the Shenzhen Science and Technology Plan (CXZZ20140418182638764), the Research Foundation of Shenzhen University (85303/00000155), a Research Cultivation Project from the Shenzhen Institute of Information Technology (ZY201717), and an Innovating and Upgrading Institute Project from the Department of Education of Guangdong Province (2017GWTSCX038).

Author information

Corresponding author: Hong Wang


Copyright information

© 2019 Springer Nature Switzerland AG

About this paper


Cite this paper

Niu, B., Yang, X., Wang, H. (2019). Feature Selection Using a Reinforcement-Behaved Brain Storm Optimization. In: Huang, D.-S., Huang, Z.-K., Hussain, A. (eds.) Intelligent Computing Methodologies. ICIC 2019. Lecture Notes in Computer Science, vol. 11645. Springer, Cham. https://doi.org/10.1007/978-3-030-26766-7_61

  • DOI: https://doi.org/10.1007/978-3-030-26766-7_61

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-26765-0

  • Online ISBN: 978-3-030-26766-7

  • eBook Packages: Computer Science; Computer Science (R0)
