Low Redundancy Learning for Unsupervised Multi-view Feature Selection

  • Conference paper
  • First Online:
Knowledge Science, Engineering and Management (KSEM 2023)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 14117)


Abstract

Multi-view feature selection is an important research direction in multi-view learning. Most existing multi-view feature selection methods focus on the correlation between features and the category structure of the data while ignoring the redundancy among features. In this paper, we propose a multi-view feature selection method based on low redundancy learning, which introduces a feature-redundancy term for each view and automatically assigns its weight to the projection matrix. By applying the \(l_{2,1}\) norm to the projection matrix to enforce row sparsity, feature subsets with high relevance and low redundancy can be selected. To make full use of the consistency among views, we also use spectral analysis to learn the latent category structure of each view and minimize the difference between each single-view category structure and a consensus clustering indicator matrix. Finally, an alternating iterative update method is presented to solve the resulting optimization problem. Experiments on several public multi-view datasets verify the effectiveness of the proposed method.
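
To make the selection step concrete, the following is a minimal NumPy sketch, not the authors' implementation, of how row sparsity induced by an \(l_{2,1}\) penalty on a projection matrix is commonly turned into a feature ranking: rows that the penalty drives toward zero correspond to features that are dropped, and the remaining features are ranked by the \(l_{2}\) norms of their rows. The matrix W, its dimensions, and the helper names are illustrative assumptions.

    import numpy as np

    def l21_norm(W):
        # l2,1 norm: sum of the Euclidean norms of the rows of W.
        return np.sum(np.linalg.norm(W, axis=1))

    def rank_features_by_row_norm(W):
        # Score each of the d features by the l2 norm of its row in the
        # projection matrix W (d x c). Under an l2,1 penalty, rows of
        # uninformative features shrink toward zero, so larger row norms
        # indicate more relevant features.
        scores = np.linalg.norm(W, axis=1)
        return np.argsort(-scores), scores

    # Toy usage with a synthetic projection matrix (illustrative only).
    rng = np.random.default_rng(0)
    d, c = 20, 4
    W = rng.normal(size=(d, c))
    W[rng.choice(d, size=10, replace=False)] *= 0.05  # mimic rows suppressed by the penalty
    order, scores = rank_features_by_row_norm(W)
    print("l2,1 norm of W:", round(float(l21_norm(W)), 3))
    print("top-5 ranked features:", order[:5].tolist())

In a multi-view setting, one such projection matrix would typically be learned per view and the per-view rankings combined using the learned view weights; that combination step is specific to the method proposed in this paper and is not reproduced here.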

This work was supported by the National Natural Science Foundation of China under Grant 61806131 and the Natural Science Foundation of Guangdong Province under Grant 2018A030310510.


Author information

Corresponding author

Correspondence to Hong Jia.

Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Jia, H., Huang, J. (2023). Low Redundancy Learning for Unsupervised Multi-view Feature Selection. In: Jin, Z., Jiang, Y., Buchmann, R.A., Bi, Y., Ghiran, A.M., Ma, W. (eds.) Knowledge Science, Engineering and Management. KSEM 2023. Lecture Notes in Computer Science, vol. 14117. Springer, Cham. https://doi.org/10.1007/978-3-031-40283-8_16

  • DOI: https://doi.org/10.1007/978-3-031-40283-8_16

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-40282-1

  • Online ISBN: 978-3-031-40283-8

  • eBook Packages: Computer Science, Computer Science (R0)
