Abstract
The performance of a feature selection method is typically measured by classification accuracy and the number of selected features. Particle swarm optimization (PSO) has been found to be more competitive as a feature selection method than other optimization counterparts. However, the standard PSO algorithm suffers from premature convergence, a condition in which the swarm becomes trapped in a local optimum and is prevented from converging to a better position. This paper attempts to improve the velocity-based reinitialization (VBR) method for the feature selection problem, using a support vector machine classifier within a wrapper strategy. The method was evaluated on five benchmark datasets, and the results were analyzed in terms of classifier performance and the number of selected features. On average, the accuracy of particle swarm optimization with the improved velocity-based reinitialization method was higher than that of the existing VBR method, and it generally selected a smaller number of features.
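The general idea the abstract describes can be sketched as binary PSO for feature selection, with a velocity-based reinitialization step that restores exploration when a particle's velocity collapses toward zero. This is a minimal illustrative sketch, not the authors' implementation: the threshold `v_thresh`, the stagnation criterion (mean absolute velocity), and the toy fitness function standing in for the SVM wrapper objective are all assumptions made here for illustration.

```python
import math
import random

def binary_pso_vbr(fitness, n_features, n_particles=20, iters=50,
                   w=0.7, c1=1.5, c2=1.5, v_thresh=0.1, seed=0):
    """Binary PSO with a velocity-based reinitialization (VBR) step.

    A particle whose mean |velocity| falls below v_thresh is treated as
    stagnant and has its velocity reinitialized, countering the
    premature convergence the abstract refers to (illustrative only).
    """
    rng = random.Random(seed)
    sigmoid = lambda v: 1.0 / (1.0 + math.exp(-v))

    # Random binary positions (feature masks) and real-valued velocities.
    pos = [[rng.randint(0, 1) for _ in range(n_features)]
           for _ in range(n_particles)]
    vel = [[rng.uniform(-1.0, 1.0) for _ in range(n_features)]
           for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_fit = [fitness(p) for p in pos]
    g = max(range(n_particles), key=lambda i: pbest_fit[i])
    gbest, gbest_fit = pbest[g][:], pbest_fit[g]

    for _ in range(iters):
        for i in range(n_particles):
            for d in range(n_features):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # Standard binary PSO position update via the sigmoid.
                pos[i][d] = 1 if rng.random() < sigmoid(vel[i][d]) else 0
            # VBR step: reinitialize the velocity of a stagnant particle.
            if sum(abs(v) for v in vel[i]) / n_features < v_thresh:
                vel[i] = [rng.uniform(-1.0, 1.0) for _ in range(n_features)]
            f = fitness(pos[i])
            if f > pbest_fit[i]:
                pbest[i], pbest_fit[i] = pos[i][:], f
                if f > gbest_fit:
                    gbest, gbest_fit = pos[i][:], f
    return gbest, gbest_fit

def toy_fitness(mask):
    # Hypothetical stand-in for the SVM wrapper objective: reward the
    # first 3 (informative) features, penalize extra selected features.
    return sum(mask[:3]) - 0.2 * sum(mask[3:])

best, fit = binary_pso_vbr(toy_fitness, n_features=10)
```

In a real wrapper setting, `toy_fitness` would be replaced by cross-validated SVM accuracy on the subset of columns selected by the mask, typically with a small penalty term on the subset size.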
© 2015 Springer Science+Business Media Singapore
Cite this paper
Abdul-Rahman, S., Bakar, A.A., Mohamed-Hussein, ZA. (2015). An Improved Particle Swarm Optimization via Velocity-Based Reinitialization for Feature Selection. In: Berry, M., Mohamed, A., Yap, B. (eds) Soft Computing in Data Science. SCDS 2015. Communications in Computer and Information Science, vol 545. Springer, Singapore. https://doi.org/10.1007/978-981-287-936-3_1
DOI: https://doi.org/10.1007/978-981-287-936-3_1
Publisher Name: Springer, Singapore
Print ISBN: 978-981-287-935-6
Online ISBN: 978-981-287-936-3