Abstract
Expressive evaluation metrics are indispensable for informative experiments. While well-established metrics exist in many areas, others, such as feature selection, offer only indirect or otherwise limited means of evaluation. In this paper, we propose a novel evaluation metric that addresses several shortcomings of its predecessors and allows for flexible and reliable evaluation of feature selection algorithms. The proposed metric is a dynamic metric with two properties that can be used to evaluate both the performance and the stability of a feature selection algorithm. We conduct several empirical experiments to illustrate the use of the proposed metric in the evaluation of feature selection algorithms, and we provide a comparison and analysis of the different aspects involved in their evaluation. The results indicate that the proposed metric successfully carries out the evaluation task for feature selection algorithms.
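To make the general idea of dynamic evaluation concrete, the following minimal sketch (not the metric proposed in the paper; the `SelectKBest`/`f_classif` selector, the random forest downstream model, and the Jaccard-based stability score are all illustrative assumptions) traces downstream accuracy over growing feature subsets and measures the stability of the selected subsets across bootstrap resamples.

```python
# Hypothetical illustration of dynamic feature selection evaluation.
# NOT the paper's proposed metric: selector, classifier, and stability
# score are generic placeholder choices.
import numpy as np
from itertools import combinations
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.utils import resample

X, y = load_breast_cancer(return_X_y=True)

# Performance curve: downstream accuracy as the number of selected features grows.
performance = []
for k in range(1, X.shape[1] + 1, 5):
    scores = SelectKBest(f_classif, k="all").fit(X, y).scores_
    idx = np.argsort(scores)[::-1][:k]
    acc = cross_val_score(RandomForestClassifier(random_state=0),
                          X[:, idx], y, cv=5).mean()
    performance.append((k, acc))

# Stability: average pairwise Jaccard similarity of the top-k subsets
# selected on bootstrap resamples of the data.
k, runs = 10, 20
subsets = []
for seed in range(runs):
    Xb, yb = resample(X, y, random_state=seed)
    scores = SelectKBest(f_classif, k="all").fit(Xb, yb).scores_
    subsets.append(set(np.argsort(scores)[::-1][:k]))
stability = np.mean([len(a & b) / len(a | b) for a, b in combinations(subsets, 2)])

print(performance)
print(f"Jaccard stability over {runs} bootstraps (k={k}): {stability:.3f}")
```

The two quantities sketched here correspond only loosely to the two properties mentioned in the abstract: a performance trajectory over varying subset sizes and a stability score in the spirit of the indices of Kuncheva [8] and Nogueira et al. [10]; the paper's own formulation is given in the full text and its extended arXiv version [12].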
Notes
1. An extended version of this paper providing more details is available on arXiv [12].
References
Cao, X., Zhang, C., Fu, H., Liu, S., Zhang, H.: Diversity-induced multi-view subspace clustering. In: CVPR, pp. 586–594. IEEE Computer Society (2015)
Chandrashekar, G., Sahin, F.: A survey on feature selection methods. Comput. Electr. Eng. 40(1), 16–28 (2014)
Co, T.B.: Methods of Applied Mathematics for Engineers and Scientists. Cambridge University Press (2013)
Fan, Y., Liu, J., Tang, J., Liu, P., Lin, Y., Du, Y.: Learning correlation information for multi-label feature selection. Pattern Recogn. 145, 109899 (2024)
Guha, R., Khan, H.A., Singh, P.K., Sarkar, R., Bhattacharjee, D.: CGA: a new feature selection model for visual human action recognition. Neural Comput. Appl. 33(10), 5267–5286 (2021)
Guo, Y., Sun, Y., Wang, Z., Nie, F., Wang, F.: Double-structured sparsity guided flexible embedding learning for unsupervised feature selection. IEEE Trans. Neural Netw. Learn. Syst. 1–14 (2023)
Kelly, M., Longjohn, R., Nottingham, K.: The UCI Machine Learning Repository. https://archive.ics.uci.edu (2024)
Kuncheva, L.I.: A stability index for feature selection. In: Artificial Intelligence and Applications, pp. 421–427. IASTED/ACTA Press (2007)
Mostert, W., Malan, K.M., Engelbrecht, A.P.: A feature selection algorithm performance metric for comparative analysis. Algorithms 14(3), 100 (2021)
Nogueira, S., Sechidis, K., Brown, G.: On the stability of feature selection algorithms. J. Mach. Learn. Res. 18, 174:1–174:54 (2017)
Pan, H., Chen, S., Xiong, H.: A high-dimensional feature selection method based on modified Gray Wolf optimization. Appl. Soft Comput. 135, 110031 (2023)
Rajabinasab, M., Lautrup, A.D., Hyrup, T., Zimek, A.: FSDEM: Feature Selection Dynamic Evaluation Metric (2024). arXiv:2408.14234 [cs.LG]. https://arxiv.org/abs/2408.14234
Wang, C., Wang, J., Gu, Z., Wei, J.-M., Liu, J.: Unsupervised feature selection by learning exponential weights. Pattern Recogn. 148, 110183 (2024)
Acknowledgement
This study was funded by Innovation Fund Denmark in the project “PREPARE: Personalized Risk Estimation and Prevention of Cardiovascular Disease”.
Copyright information
© 2025 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Rajabinasab, M., Lautrup, A.D., Hyrup, T., Zimek, A. (2025). A Dynamic Evaluation Metric for Feature Selection. In: Chávez, E., Kimia, B., Lokoč, J., Patella, M., Sedmidubsky, J. (eds) Similarity Search and Applications. SISAP 2024. Lecture Notes in Computer Science, vol 15268. Springer, Cham. https://doi.org/10.1007/978-3-031-75823-2_6
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-75822-5
Online ISBN: 978-3-031-75823-2
eBook Packages: Computer Science (R0)