Abstract
In this paper, a novel feature selection algorithm, named Feature Selection with Differences and Similarities (FSDS), is proposed. FSDS jointly exploits sample differences from the global structure and sample similarities from the local structure. To reduce the disturbance from noisy features, a row-wise sparse constraint is also incorporated into the objective function. FSDS then combines the learned subspace features with the original features to construct a more reliable feature set. Furthermore, a joint version of FSDS (FSDS2) is introduced. To optimize the two-step FSDS and the joint FSDS2, we design two efficient iterative algorithms. Experimental results on various datasets demonstrate the effectiveness of the proposed algorithms.
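The abstract's "row-wise sparse constraint" refers to penalizing the rows of a transformation matrix so that noisy features are driven toward zero. The full FSDS objective is not given in this excerpt, so the sketch below illustrates only that general mechanism, using the well-known iterative reweighting scheme for joint l2,1-norm minimization (in the style of Nie et al., 2010); the function name, the synthetic data, and the pseudo-targets `Y` are illustrative assumptions, not the paper's method.

```python
import numpy as np

def l21_feature_scores(X, Y, gamma=0.1, n_iter=30, eps=1e-8):
    """Score features by the row norms of W minimizing
    ||X W - Y||_F^2 + gamma * ||W||_{2,1},
    solved with the standard reweighted least-squares iteration.
    The l2,1 penalty shrinks whole rows of W, so rows belonging
    to noisy features end up near zero while informative features
    keep large row norms.
    """
    d = X.shape[1]
    D = np.eye(d)  # reweighting matrix, starts as the identity
    for _ in range(n_iter):
        # closed-form update: W = (X^T X + gamma D)^{-1} X^T Y
        W = np.linalg.solve(X.T @ X + gamma * D, X.T @ Y)
        row_norms = np.sqrt((W ** 2).sum(axis=1)) + eps
        D = np.diag(1.0 / (2.0 * row_norms))  # D_ii = 1 / (2 ||w_i||_2)
    return np.sqrt((W ** 2).sum(axis=1))  # per-feature scores

# Toy check: 6 features, but the targets depend only on features 0 and 3,
# so those two rows of W should dominate the scores.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 6))
Y = np.c_[X[:, 0] + X[:, 3], X[:, 0] - X[:, 3]]  # hypothetical pseudo-targets
scores = l21_feature_scores(X, Y)
top2 = set(np.argsort(scores)[-2:])
print(top2)  # features 0 and 3 should rank highest
```

In an unsupervised setting such as FSDS, `Y` would not be hand-made as above but derived from the data (e.g. cluster indicators or an embedding); the row-norm ranking step stays the same.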
Copyright information
© 2020 Springer Nature Switzerland AG
Cite this paper
Li, X., You, N., Tan, J., Bi, N. (2020). Differences and Similarities Learning for Unsupervised Feature Selection. In: Lu, Y., Vincent, N., Yuen, P.C., Zheng, WS., Cheriet, F., Suen, C.Y. (eds) Pattern Recognition and Artificial Intelligence. ICPRAI 2020. Lecture Notes in Computer Science(), vol 12068. Springer, Cham. https://doi.org/10.1007/978-3-030-59830-3_24
Print ISBN: 978-3-030-59829-7
Online ISBN: 978-3-030-59830-3