Abstract
Feature selection is an essential preprocessing step in data mining and machine learning. A feature selection task can be treated as a multi-objective optimization problem that simultaneously minimizes the classification error and the number of selected features. Many existing feature selection approaches, including multi-objective methods, neglect that multiple optimal solutions can exist: different feature subsets may achieve the same or similar classification performance. Furthermore, when evolutionary multi-objective optimization is applied to feature selection, a crowding distance metric typically plays a role in environmental selection. However, existing crowding metrics computed over continuous/numeric values are inappropriate for feature selection, whose search space is discrete. This paper therefore proposes a new environmental selection method that modifies the calculation of the crowding metrics, which is expected to help a multi-objective feature selection algorithm find multiple potentially optimal feature subsets. Experiments on sixteen datasets of varying difficulty show that the proposed approach finds more diverse feature subsets achieving the same classification performance, without deteriorating hypervolume or inverted generational distance.
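The abstract does not detail the proposed modification, but the motivating problem can be illustrated with a sketch. Below, `crowding_distance` is the standard NSGA-II objective-space crowding metric, and `hamming_crowding` is an assumed decision-space alternative that measures diversity between binary feature masks via Hamming distance. This is an illustrative construction, not the paper's actual method: it shows why an objective-space metric cannot distinguish two different feature subsets with identical classification error and subset size, while a discrete decision-space measure can.

```python
import numpy as np

def crowding_distance(objs):
    """Standard NSGA-II crowding distance over objective values.

    objs: (n, m) array, one row of m objective values per solution.
    """
    n, m = objs.shape
    dist = np.zeros(n)
    for k in range(m):
        order = np.argsort(objs[:, k])
        dist[order[0]] = dist[order[-1]] = np.inf  # boundary solutions kept
        span = objs[order[-1], k] - objs[order[0], k]
        if span == 0:
            continue
        for i in range(1, n - 1):
            dist[order[i]] += (objs[order[i + 1], k] - objs[order[i - 1], k]) / span
    return dist

def hamming_crowding(masks):
    """Decision-space crowding (illustrative assumption): mean Hamming
    distance of each binary feature mask to every other mask, so that
    structurally distinct subsets score higher even when their
    objective values coincide."""
    masks = np.asarray(masks, dtype=bool)
    n = len(masks)
    dist = np.zeros(n)
    for i in range(n):
        dist[i] = np.mean([np.count_nonzero(masks[i] ^ masks[j])
                           for j in range(n) if j != i])
    return dist
```

For two solutions with identical (error, subset-size) objectives, `crowding_distance` assigns identical values, so environmental selection may discard one arbitrarily; `hamming_crowding` separates them whenever their feature masks differ, which is the behaviour a multimodal-aware feature selection method needs.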
Copyright information
© 2021 Springer Nature Switzerland AG
Cite this paper
Wang, P., Xue, B., Liang, J., Zhang, M. (2021). Improved Crowding Distance in Multi-objective Optimization for Feature Selection in Classification. In: Castillo, P.A., Jiménez Laredo, J.L. (eds) Applications of Evolutionary Computation. EvoApplications 2021. Lecture Notes in Computer Science(), vol 12694. Springer, Cham. https://doi.org/10.1007/978-3-030-72699-7_31
DOI: https://doi.org/10.1007/978-3-030-72699-7_31
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-72698-0
Online ISBN: 978-3-030-72699-7
eBook Packages: Computer Science (R0)