
Unsupervised maximum margin feature selection via L2,1-norm minimization

  • Original Article
  • Published in Neural Computing and Applications

Abstract

In this article, we present an unsupervised maximum margin feature selection algorithm with sparse constraints. The algorithm combines feature selection and K-means clustering in a coherent framework. L2,1-norm regularization is applied to the transformation matrix to enable feature selection across all data samples. Our method amounts to solving a convex optimization problem, and the resulting iterative algorithm converges to the optimal solution; a convergence analysis is also provided. Experimental results demonstrate the efficiency of our algorithm.



Acknowledgments

We gratefully acknowledge the support of the National Natural Science Foundation of China under Grants No. 60975038 and No. 60105003.

Author information

Correspondence to Shizhun Yang.

Cite this article

Yang, S., Hou, C., Nie, F. et al. Unsupervised maximum margin feature selection via L 2,1-norm minimization. Neural Comput & Applic 21, 1791–1799 (2012). https://doi.org/10.1007/s00521-012-0827-3
