
Differences and Similarities Learning for Unsupervised Feature Selection

  • Conference paper
  • Pattern Recognition and Artificial Intelligence (ICPRAI 2020)

Abstract

In this paper, a novel feature selection algorithm, named Feature Selection with Differences and Similarities (FSDS), is proposed. FSDS jointly exploits sample differences from the global structure and sample similarities from the local structure. To reduce the disturbance from noisy features, a row-wise sparse constraint is also merged into the objective function. FSDS then combines the underlying subspace features with the original features to construct a more reliable feature set. Furthermore, a joint version of FSDS (FSDS2) is introduced. To optimize the two-step FSDS and the joint FSDS2, we also design two efficient iterative algorithms. Experimental results on various datasets demonstrate the effectiveness of the proposed algorithms.
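The row-wise sparse scoring idea the abstract describes can be illustrated with a minimal sketch. This is not the authors' FSDS objective or optimization: here a plain SVD subspace stands in for the learned subspace, and the function names `l21_row_scores` and `select_features` are hypothetical. The point is only the mechanism — learn a projection matrix W from features to a low-dimensional subspace, then rank features by the l2-norms of the rows of W, since an l2,1-style penalty drives whole rows (whole features) toward zero.

```python
import numpy as np

def l21_row_scores(X, n_components=2):
    """Score each feature by the l2-norm of its row in a projection
    matrix W mapping features to a low-dimensional subspace.
    Simplified illustration only; NOT the FSDS objective -- the
    subspace here comes from an SVD rather than joint learning."""
    X = X - X.mean(axis=0)                      # center the data
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    W = Vt[:n_components].T                     # (n_features, n_components)
    return np.linalg.norm(W, axis=1)            # row-wise l2 norms

def select_features(X, k, n_components=2):
    """Return indices of the k highest-scoring features."""
    scores = l21_row_scores(X, n_components)
    return np.argsort(scores)[::-1][:k]

# Toy data: features 0 and 1 carry shared structure, feature 2 is
# near-pure low-variance noise and should be ranked last.
rng = np.random.default_rng(0)
z = rng.normal(size=(100, 1))
X = np.hstack([3 * z,
               z + 0.1 * rng.normal(size=(100, 1)),
               0.01 * rng.normal(size=(100, 1))])
print(sorted(select_features(X, 2).tolist()))
```

With data like this, the noise feature gets a near-zero row norm in the leading components, so the structured features are selected first.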



Author information


Corresponding author

Correspondence to Jun Tan.


Copyright information

© 2020 Springer Nature Switzerland AG

About this paper


Cite this paper

Li, X., You, N., Tan, J., Bi, N. (2020). Differences and Similarities Learning for Unsupervised Feature Selection. In: Lu, Y., Vincent, N., Yuen, P.C., Zheng, WS., Cheriet, F., Suen, C.Y. (eds) Pattern Recognition and Artificial Intelligence. ICPRAI 2020. Lecture Notes in Computer Science(), vol 12068. Springer, Cham. https://doi.org/10.1007/978-3-030-59830-3_24


  • DOI: https://doi.org/10.1007/978-3-030-59830-3_24

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-59829-7

  • Online ISBN: 978-3-030-59830-3

  • eBook Packages: Computer Science (R0)
