
Joint Adaptive Graph Learning and Discriminative Analysis for Unsupervised Feature Selection

Published in Cognitive Computation

Abstract

Unsupervised feature selection plays an important role in processing high-dimensional, unlabeled data. Conventional spectral-based unsupervised feature selection methods learn the subspace from a predefined graph constructed from the original features. If the data are corrupted by the noise or redundancy common in high-dimensional features, the graph will be inaccurate and will degrade the performance of downstream tasks. In this paper, we propose a new unsupervised feature selection method in which the graph adjusts itself, guided jointly by the original graph and the learned subspace, toward the optimal one. In addition, an uncorrelated constraint is imposed to enhance the discriminability of the model. To optimize the model, we propose an alternating iterative algorithm and provide a rigorous convergence proof. Extensive experiments evaluate our method against other state-of-the-art methods. The proposed adaptive graph learning strategy learns a high-quality graph that captures the data structure more accurately, and the uncorrelated constraint effectively ensures the discriminability of the selected features.
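To make the abstract's idea concrete, the following is a minimal, hypothetical sketch of the alternating scheme it describes, not the paper's exact objective or algorithm: a k-NN similarity graph is rebuilt each pass from adaptively re-weighted features, each feature is scored by how smoothly it varies on the current graph (a Laplacian-score-style criterion), and smoother features receive more weight, so the graph adapts away from noisy dimensions. All function and variable names here are illustrative assumptions.

```python
import numpy as np

def adaptive_graph_feature_select(X, n_select, n_neighbors=5, n_iters=5):
    """Toy sketch of adaptive-graph unsupervised feature selection.

    Alternates between (a) building a k-NN similarity graph on re-weighted
    features and (b) scoring each feature by its smoothness on that graph;
    smooth features get larger weights, so the graph self-adjusts.
    """
    n, d = X.shape
    Xc = X - X.mean(axis=0)                 # center features
    w = np.ones(d)                          # adaptive feature weights
    for _ in range(n_iters):
        Z = Xc * w                          # re-weighted data for graph building
        dist = np.linalg.norm(Z[:, None, :] - Z[None, :, :], axis=2)
        S = np.zeros((n, n))
        for i in range(n):
            idx = np.argsort(dist[i])[1:n_neighbors + 1]   # k nearest (skip self)
            sigma = np.median(dist[i, idx]) + 1e-12        # local kernel width
            S[i, idx] = np.exp(-(dist[i, idx] / sigma) ** 2)
        S = (S + S.T) / 2                   # symmetric similarity graph
        D = np.diag(S.sum(axis=1))
        L = D - S                           # unnormalized graph Laplacian
        # Smoothness of each feature f on the graph: f^T L f / f^T D f.
        num = np.sum(Xc * (L @ Xc), axis=0)
        den = np.sum(Xc * (D @ Xc), axis=0) + 1e-12
        scores = num / den                  # smaller = smoother = better
        w = 1.0 / (scores + 1e-6)
        w /= w.max()                        # keep weights on a sane scale
    return np.argsort(scores)[:n_select]    # indices of the best features
```

On data where one feature separates two clusters and the rest are noise, the cluster-revealing feature varies smoothly on the graph and is ranked first, while the noise features score near 1 and are down-weighted on later passes. The paper's actual method instead learns the graph and an uncorrelated projection jointly within a single objective, which this simple re-weighting loop only approximates.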






Funding

This study was funded in part by the Guangdong Province Science and Technology Plan Projects (2017B010110011), the National Natural Science Foundation of China (No.61876002), the Key Natural Science Project of Anhui Provincial Education Department (KJ2018A0023), and the National Natural Science Foundation of Anhui Province (2008085MF191, 2008085QF306).

Author information


Corresponding author

Correspondence to Qi Li.

Ethics declarations

Conflict of Interest

The authors Haifeng Zhao, Qi Li, Zheng Wang, and Feiping Nie declare that they have no conflicts of interest.

Ethical Approval

This article does not contain any studies with human participants or animals performed by any of the authors.


About this article


Cite this article

Zhao, H., Li, Q., Wang, Z. et al. Joint Adaptive Graph Learning and Discriminative Analysis for Unsupervised Feature Selection. Cogn Comput 14, 1211–1221 (2022). https://doi.org/10.1007/s12559-021-09875-0

