
Variance Based Particle Swarm Optimization for Function Optimization and Feature Selection

  • Conference paper
  • First Online:
  • Part of the book: Swarm, Evolutionary, and Memetic Computing (SEMCCO 2015)
  • Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 9873)


Abstract

Soft computing based techniques have been widely used in multi-objective optimization problems such as multi-modal function optimization, control and automation, network routing, and feature selection. Feature Selection (FS) in high-dimensional data can be modeled as a multi-objective optimization problem that reduces the number of features while improving overall accuracy. Traditional local optimization methods generally cannot achieve this twin goal, as such problems have many locally optimal solutions. Recently, various flavors of Particle Swarm Optimization (PSO) have been successfully applied to function optimization. The main issue with these PSO variants is that they tend to get stuck in local optima.

In this paper, we develop a novel variant of PSO which controls the velocity of particles in a swarm. We call the proposed method Variance Particle Swarm Optimization (VPSO). In VPSO, the velocity is influenced by the variance of the population: when the population variance is high, particles emphasize exploitation, and vice versa. This reduces the tendency of the swarm to become trapped in local optima. We have validated VPSO for both function optimization and feature selection. The proposed VPSO method achieves significantly better results than various existing PSO methods on eight publicly available benchmark functions and on five publicly available benchmark datasets for feature selection.
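The abstract does not give the exact velocity update rule, so the following is a minimal Python sketch of one plausible reading: a standard global-best PSO whose inertia weight is modulated by the normalized population variance, so that a high-variance (diverse) swarm emphasizes exploitation and a low-variance (collapsed) swarm is pushed toward exploration. The function names (`vpso`, `sphere`), the variance-to-inertia mapping, and all parameter values are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def sphere(x):
    """Sphere function, a standard benchmark objective (illustrative choice)."""
    return np.sum(x ** 2)

def vpso(objective, dim=10, n_particles=30, n_iters=200,
         c1=2.0, c2=2.0, w_min=0.4, w_max=0.9, bounds=(-100.0, 100.0)):
    """Hypothetical sketch of a variance-modulated PSO (not the paper's exact rule)."""
    lo, hi = bounds
    rng = np.random.default_rng(0)
    pos = rng.uniform(lo, hi, size=(n_particles, dim))
    vel = rng.uniform(-(hi - lo), hi - lo, size=(n_particles, dim)) * 0.1

    pbest = pos.copy()
    pbest_val = np.array([objective(p) for p in pos])
    gbest = pbest[np.argmin(pbest_val)].copy()
    gbest_val = pbest_val.min()

    for _ in range(n_iters):
        # Normalized population variance in [0, 1]: mean per-dimension variance
        # divided by the largest variance possible on the search interval.
        var = np.mean(np.var(pos, axis=0)) / ((hi - lo) ** 2 / 4.0)
        var = np.clip(var, 0.0, 1.0)

        # Assumed coupling: high variance -> small inertia (exploit around the
        # best positions); low variance -> large inertia (explore more widely).
        w = w_min + (w_max - w_min) * (1.0 - var)

        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)

        vals = np.array([objective(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved] = pos[improved]
        pbest_val[improved] = vals[improved]
        if pbest_val.min() < gbest_val:
            gbest_val = pbest_val.min()
            gbest = pbest[np.argmin(pbest_val)].copy()

    return gbest, gbest_val

best_x, best_f = vpso(sphere)
print(f"best objective value: {best_f:.6f}")
```

For feature selection, the same scheme would typically be applied to a binary or thresholded particle encoding, with classifier accuracy on the selected feature subset serving as the fitness; the details above remain a sketch under the stated assumptions.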


Notes

  1. http://www.particleswarm.info/Programs.html.

  2. http://code.google.com/p/popot.

  3. http://www.sfu.ca/~ssurjano/optimization.html, https://code.google.com/p/popot/.

  4. archive.ics.uci.edu/ml/datasets.html.


Author information

Corresponding author

Correspondence to Yamuna Prasad.


Copyright information

© 2016 Springer International Publishing AG

About this paper

Cite this paper

Prasad, Y., Biswas, K.K., Hanmandlu, M., Jain, C.K. (2016). Variance Based Particle Swarm Optimization for Function Optimization and Feature Selection. In: Panigrahi, B., Suganthan, P., Das, S., Satapathy, S. (eds) Swarm, Evolutionary, and Memetic Computing. SEMCCO 2015. Lecture Notes in Computer Science, vol 9873. Springer, Cham. https://doi.org/10.1007/978-3-319-48959-9_10

  • DOI: https://doi.org/10.1007/978-3-319-48959-9_10

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-48958-2

  • Online ISBN: 978-3-319-48959-9

  • eBook Packages: Computer Science, Computer Science (R0)
