Local structure preservation in Kernel space for feature selection

Multimedia Tools and Applications

Abstract

For many machine learning and data mining tasks in today's information-explosion environment, one is often confronted with very high-dimensional heterogeneous data. This has increased the demand for new methods that select discriminative and valuable features beneficial to classification and clustering. In this paper, we propose a novel feature selection method that jointly maps the original data from the input space to a kernel space and conducts both subspace learning (via locality preserving projection) and feature selection (via a sparsity constraint). Specifically, the nonlinear relationships among data points are adequately explored by mapping the data from the original low-dimensional space to the kernel space. Meanwhile, the subspace learning technique is leveraged to preserve the local structure of the ambient space. Finally, by restricting the sparsity of the coefficient matrix, the weights of some features shrink to zero; as a result, redundant and irrelevant features are eliminated, and the method selects informative and distinguishing features. Experimental results comparing the proposed method with several state-of-the-art methods demonstrate that it outperforms them on clustering tasks.
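The abstract combines three ingredients: a kernel mapping to capture nonlinear structure, a locality-preserving graph term, and an l2,1-style row-sparsity penalty whose zeroed rows drop features. The sketch below is a minimal illustration of that recipe under stated assumptions, not the authors' exact algorithm: it builds an RBF-kernel k-nearest-neighbour affinity graph, solves a locality-preserving objective with an iteratively reweighted l2,1 penalty, and ranks features by the row norms of the learned coefficient matrix. All function names and parameters are hypothetical, and letting the kernel enter only through the affinity graph (while the projection acts in the input space) is a simplification of the paper's full kernel-space formulation.

```python
import numpy as np
from scipy.linalg import eigh
from scipy.spatial.distance import cdist

def lpp_l21_scores(X, n_components=5, k=5, sigma=1.0, lam=1.0, n_iter=20):
    """Score features via locality preserving projection with an
    iteratively reweighted l2,1 row-sparsity penalty (illustrative)."""
    n, d = X.shape

    # RBF (Gaussian) kernel affinities capture nonlinear similarity.
    S = np.exp(-cdist(X, X, "sqeuclidean") / (2.0 * sigma ** 2))

    # k-nearest-neighbour graph preserves local structure.
    nn = np.argsort(-S, axis=1)[:, 1:k + 1]   # column 0 is the point itself
    A = np.zeros((n, n))
    rows = np.repeat(np.arange(n), k)
    A[rows, nn.ravel()] = S[rows, nn.ravel()]
    A = np.maximum(A, A.T)                    # symmetrize the graph
    L = np.diag(A.sum(axis=1)) - A            # graph Laplacian

    # Iteratively reweighted solution of
    #   min_W  tr(W^T X^T L X W) + lam * ||W||_{2,1}
    #   s.t.   W^T X^T X W = I
    M = X.T @ L @ X
    C = X.T @ X + 1e-6 * np.eye(d)            # small ridge term for stability
    W = np.linalg.qr(np.random.randn(d, n_components))[0]
    for _ in range(n_iter):
        D = np.diag(1.0 / (2.0 * np.linalg.norm(W, axis=1) + 1e-8))
        # Smallest generalized eigenvectors of (M + lam*D) w = mu * C w.
        _, W = eigh(M + lam * D, C, subset_by_index=[0, n_components - 1])

    return np.linalg.norm(W, axis=1)          # larger row norm = more informative
```

A hypothetical call would be `scores = lpp_l21_scores(X)` followed by `np.argsort(-scores)[:m]` to keep the m highest-scoring features. The generalized-eigenvector step is one common way to handle the orthogonality constraint in such objectives; the paper's actual optimizer and constraint set may differ.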




Acknowledgments

This work was supported by the program "Research and Development of an Intelligent Logistics Management System Based on Beidou Multifunctional Information Acquisition and Monitoring Terminal" (Grant No. 2016AB04097).

Author information


Corresponding author

Correspondence to Long Chen.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Zhong, Z., Chen, L. Local structure preservation in Kernel space for feature selection. Multimed Tools Appl 78, 33339–33356 (2019). https://doi.org/10.1007/s11042-018-6926-0

