
Unsupervised feature selection based on local structure learning

Published in Multimedia Tools and Applications

Abstract

Unsupervised feature selection has become a significant and challenging problem in machine learning because of the vast amounts of high-dimensional unlabeled data. Traditional unsupervised feature selection algorithms usually rely on a similarity matrix and therefore depend heavily on the learned structure. However, real-world data often contain many noisy samples or features, which can make the similarity matrix obtained from the original data unreliable. In this paper, we propose a novel unsupervised feature selection method based on ideal Local Structure Learning (LSL) that performs feature selection and local structure learning simultaneously. To obtain more accurate structure information, an ideal local structure whose graph has exactly c connected components (where c is the number of clusters) is used to refine the similarity matrix. Moreover, an effective and simple iterative algorithm is developed to optimize the proposed objective. Experiments on multiple public benchmark datasets, including biomedical data, letter recognition and digit data, and face image data, show that our algorithm outperforms state-of-the-art methods.
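The full method is behind the paywall, but the abstract names its main ingredients: a similarity graph learned from the data, a refinement step that drives that graph toward exactly c connected components, and a feature ranking based on the refined local structure. The sketch below illustrates only that general pipeline, not the authors' actual optimization: it approximates the exact-c constraint by keeping just the intra-cluster edges of a c-way spectral partition, and it ranks features with the classic Laplacian score as a stand-in criterion. The function names, the parameters k and c, and the use of scikit-learn/SciPy are all illustrative assumptions.

```python
# Minimal sketch (NOT the paper's algorithm): refine a kNN similarity graph
# toward c connected components, then rank features by the Laplacian score.
import numpy as np
from scipy.sparse.csgraph import connected_components
from sklearn.neighbors import kneighbors_graph
from sklearn.cluster import SpectralClustering


def refine_similarity(X, c, k=10):
    """Build a symmetric kNN similarity matrix and drop edges that cross a
    c-way spectral partition, so the refined graph splits into >= c components.
    (A crude stand-in for the paper's exact-c-connected-components constraint.)"""
    S = kneighbors_graph(X, n_neighbors=k, mode="connectivity",
                         include_self=False).toarray()
    S = 0.5 * (S + S.T)                                   # symmetrize
    labels = SpectralClustering(n_clusters=c, affinity="precomputed",
                                random_state=0).fit_predict(S)
    mask = (labels[:, None] == labels[None, :]).astype(float)
    S_ref = S * mask                                      # keep intra-cluster edges only
    n_comp, _ = connected_components(S_ref, directed=False)
    print(f"refined graph has {n_comp} connected components (target c = {c})")
    return S_ref


def laplacian_score(X, S):
    """Classic Laplacian score: lower means the feature better preserves the
    local structure encoded in the similarity matrix S."""
    d = S.sum(axis=1)                                     # node degrees
    D, L = np.diag(d), np.diag(d) - S                     # degree matrix, graph Laplacian
    scores = []
    for r in range(X.shape[1]):
        f = X[:, r]
        f_t = f - (f @ d) / d.sum()                       # remove the degree-weighted mean
        den = f_t @ D @ f_t
        scores.append((f_t @ L @ f_t) / den if den > 1e-12 else np.inf)
    return np.array(scores)


# Toy usage: rank the 50 features of a random data matrix and keep the best 20.
X = np.random.RandomState(0).randn(200, 50)
S = refine_similarity(X, c=3, k=10)
selected = np.argsort(laplacian_score(X, S))[:20]
print("selected feature indices:", selected)
```

In the paper the exact-c structure is obtained by optimizing the similarity matrix itself jointly with the feature selection, in the spirit of constrained-Laplacian-rank clustering, whereas the mask above only guarantees at least c components; the sketch is meant solely to make the role of the refined graph concrete.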


Notes

  1. http://www.face-rec.org/databases/

  2. http://www.ics.uci.edu/mlearn/MLSummary.html


Acknowledgments

This work was supported in part by the Plan Program of Tianjin Educational Science and Research (Grant No. 2017KJ087) and the Tianjin Science and Technology Major Projects and Engineering (Grant Nos. 17ZXHLSY00040 and 17ZXSCSY00090).

Author information

Corresponding author

Correspondence to Zhitao Xiao.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Liu, Y., Geng, L., Zhang, F. et al. Unsupervised feature selection based on local structure learning. Multimed Tools Appl 79, 34571–34585 (2020). https://doi.org/10.1007/s11042-019-08549-2

