Abstract
Unsupervised feature selection has become a significant and challenging problem in machine learning due to the vast amounts of high-dimensional unlabeled data. Traditional unsupervised feature selection algorithms usually rely on a similarity matrix computed from the original data, and thus depend heavily on the learned structure. However, real-world data often contain noisy samples or features that make the similarity matrix obtained from the original data unreliable. In this paper, we propose a novel unsupervised feature selection method based on Local Structure Learning (LSL) that performs feature selection and local structure learning simultaneously. To obtain more accurate structural information, an ideal local structure with exactly c connected components (where c is the number of clusters) is used to refine the similarity matrix. Moreover, a simple and effective iterative algorithm is developed to optimize the resulting objective. Experiments on several public benchmark datasets, including biomedical data, letter-recognition data, and face-image data, demonstrate that our algorithm outperforms state-of-the-art methods.
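The "ideal local structure" constraint rests on a standard spectral fact: a similarity graph has exactly c connected components if and only if its graph Laplacian has eigenvalue 0 with multiplicity c. The sketch below (our own illustration, not the paper's algorithm; the function name and tolerance are our choices) shows how this component count can be checked for a given similarity matrix.

```python
import numpy as np

def count_connected_components(S, tol=1e-8):
    """Count connected components of the similarity graph S.

    The multiplicity of eigenvalue 0 of the Laplacian L = D - S
    equals the number of connected components of the graph.
    """
    S = (S + S.T) / 2.0                 # symmetrize the similarity matrix
    D = np.diag(S.sum(axis=1))          # degree matrix
    L = D - S                           # unnormalized graph Laplacian
    eigvals = np.linalg.eigvalsh(L)     # real eigenvalues, ascending
    return int(np.sum(eigvals < tol))   # near-zero eigenvalues

# A block-diagonal similarity matrix with two blocks -> 2 components,
# i.e. an "ideal" structure for c = 2 clusters.
S = np.zeros((4, 4))
S[0, 1] = S[1, 0] = 1.0
S[2, 3] = S[3, 2] = 1.0
print(count_connected_components(S))    # 2
```

In this spirit, refining the similarity matrix toward exactly c zero Laplacian eigenvalues makes the cluster structure explicit rather than inferred from a possibly noisy graph.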




Acknowledgments
This work was supported in part by the Plan Program of Tianjin Educational Science and Research (Grant No. 2017KJ087) and by Tianjin Science and Technology Major Projects and Engineering (Grant Nos. 17ZXHLSY00040 and 17ZXSCSY00090).
Cite this article
Liu, Y., Geng, L., Zhang, F. et al. Unsupervised feature selection based on local structure learning. Multimed Tools Appl 79, 34571–34585 (2020). https://doi.org/10.1007/s11042-019-08549-2