
An Improved Particle Swarm Optimization via Velocity-Based Reinitialization for Feature Selection

  • Conference paper
  • Soft Computing in Data Science (SCDS 2015)

Abstract

The performance of a feature selection method is typically measured by classification accuracy and the number of selected features. Particle swarm optimization (PSO) has been found to be more competitive as a feature selection method than other optimization counterparts. However, the standard PSO algorithm suffers from premature convergence, a condition in which the swarm becomes trapped in a local optimum and fails to converge to a better position. This paper improves the velocity-based reinitialization (VBR) method for the feature selection problem, using a support vector machine classifier within a wrapper strategy. Five benchmark datasets were used to evaluate the method, and the results were analyzed in terms of classifier performance and the number of selected features. On average, PSO with the improved velocity-based reinitialization method achieved higher accuracy than the existing VBR method and generally selected fewer features.
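The core idea of velocity-based reinitialization is that a near-zero velocity component signals that a particle has stagnated, so that component is redrawn at random to restore exploration. The sketch below is not the authors' implementation: the SVM wrapper fitness is replaced by a toy objective that rewards a hypothetical set of "informative" features and penalizes subset size (mirroring the accuracy-versus-feature-count trade-off), and the inertia weight, acceleration coefficients, and reinitialization threshold are assumed values.

```python
import math
import random

random.seed(0)

N_FEATURES, N_PARTICLES, N_ITER = 10, 12, 60
RELEVANT = {0, 2, 5}        # hypothetical informative features for the toy objective
W, C1, C2 = 0.72, 1.49, 1.49
V_MAX, V_THRESH = 4.0, 0.3  # V_THRESH is an assumed reinitialization cutoff

def sigmoid(v):
    return 1.0 / (1.0 + math.exp(-v))

def fitness(mask):
    # Toy stand-in for the SVM wrapper accuracy used in the paper:
    # reward hitting relevant features, lightly penalize extra ones.
    hits = sum(1 for i in RELEVANT if mask[i])
    return hits - 0.1 * (sum(mask) - hits)

def bpso_vbr():
    pos = [[random.randint(0, 1) for _ in range(N_FEATURES)] for _ in range(N_PARTICLES)]
    vel = [[random.uniform(-V_MAX, V_MAX) for _ in range(N_FEATURES)] for _ in range(N_PARTICLES)]
    pbest = [p[:] for p in pos]
    pbest_f = [fitness(p) for p in pos]
    g = max(range(N_PARTICLES), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    for _ in range(N_ITER):
        for i in range(N_PARTICLES):
            for d in range(N_FEATURES):
                r1, r2 = random.random(), random.random()
                v = (W * vel[i][d]
                     + C1 * r1 * (pbest[i][d] - pos[i][d])
                     + C2 * r2 * (gbest[d] - pos[i][d]))
                v = max(-V_MAX, min(V_MAX, v))  # clamp to the velocity bound
                # Velocity-based reinitialization: a near-zero velocity signals
                # stagnation, so the component is redrawn to restore exploration.
                if abs(v) < V_THRESH:
                    v = random.uniform(-V_MAX, V_MAX)
                vel[i][d] = v
                # Binary PSO position update via the sigmoid transfer function.
                pos[i][d] = 1 if random.random() < sigmoid(v) else 0
            f = fitness(pos[i])
            if f > pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], f
                if f > gbest_f:
                    gbest, gbest_f = pos[i][:], f
    return gbest, gbest_f

best_mask, best_f = bpso_vbr()
print(best_mask, best_f)
```

In a wrapper setting, `fitness` would instead train and cross-validate an SVM on the feature subset encoded by `mask`; the reinitialization step is the only change relative to standard binary PSO.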




Author information

Correspondence to Shuzlina Abdul-Rahman.



Copyright information

© 2015 Springer Science+Business Media Singapore

About this paper

Cite this paper

Abdul-Rahman, S., Bakar, A.A., Mohamed-Hussein, ZA. (2015). An Improved Particle Swarm Optimization via Velocity-Based Reinitialization for Feature Selection. In: Berry, M., Mohamed, A., Yap, B. (eds) Soft Computing in Data Science. SCDS 2015. Communications in Computer and Information Science, vol 545. Springer, Singapore. https://doi.org/10.1007/978-981-287-936-3_1


  • DOI: https://doi.org/10.1007/978-981-287-936-3_1


  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-287-935-6

  • Online ISBN: 978-981-287-936-3

  • eBook Packages: Computer Science, Computer Science (R0)
