DOI: 10.1145/2330163.2330175

research-article

Multi-objective particle swarm optimisation (PSO) for feature selection

Published: 07 July 2012

ABSTRACT

Feature selection (FS) is an important data preprocessing technique with two main objectives: minimising the classification error and minimising the number of features selected. Based on particle swarm optimisation (PSO), this paper proposes two multi-objective algorithms for finding the Pareto front of non-dominated solutions (feature subsets) for classification. The first algorithm introduces the idea of the non-dominated sorting based multi-objective genetic algorithm II (NSGA-II) into PSO for FS. The second algorithm applies the ideas of crowding, mutation and dominance in multi-objective PSO to search for Pareto front solutions. The two algorithms are compared with two single-objective FS methods and a conventional FS method on nine datasets. Experimental results show that both proposed algorithms can automatically evolve a smaller number of features and achieve better classification performance than using all features, and than the feature subsets obtained by the two single-objective methods and the conventional method. Both the continuous and the binary versions of PSO are investigated in the two proposed algorithms, and the results show that the continuous version generally achieves better performance than the binary version. The second new algorithm outperforms the first in both its continuous and binary versions.
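The two objectives above (classification error and subset size, both to be minimised) induce a Pareto dominance relation over candidate feature subsets: one subset dominates another if it is no worse on both objectives and strictly better on at least one, and the non-dominated subsets form the Pareto front the proposed algorithms search for. The following is a minimal illustrative sketch of that relation, not the authors' implementation; the `candidates` values are made up for illustration.

```python
def dominates(a, b):
    # a and b are (classification_error, n_features) pairs; lower is
    # better on both objectives. a dominates b if it is no worse on
    # every objective and strictly better on at least one.
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(solutions):
    # Keep only the non-dominated objective pairs.
    return [s for s in solutions
            if not any(dominates(o, s) for o in solutions if o != s)]

# Hypothetical feature subsets scored as (error, number of features).
candidates = [(0.10, 8), (0.12, 5), (0.10, 6), (0.15, 3), (0.09, 9)]
front = pareto_front(candidates)
# (0.10, 8) is dominated by (0.10, 6); the other four trade off
# error against subset size and are mutually non-dominated.
```

A multi-objective PSO, such as those in the paper, maintains an archive of such non-dominated feature subsets instead of a single global best.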


Published in

GECCO '12: Proceedings of the 14th Annual Conference on Genetic and Evolutionary Computation
July 2012
1396 pages
ISBN: 9781450311779
DOI: 10.1145/2330163

      Copyright © 2012 ACM

      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

Publisher

Association for Computing Machinery, New York, NY, United States


      Acceptance Rates

Overall acceptance rate: 1,669 of 4,410 submissions, 38%
