Abstract
In this article, we present an unsupervised maximum margin feature selection algorithm with sparse constraints. The algorithm combines feature selection and K-means clustering into a coherent framework. L2,1-norm regularization is applied to the transformation matrix, which induces row-sparsity and thereby selects features across all data samples. Our method reduces to a convex optimization problem, solved by an iterative algorithm that converges to an optimal solution; a convergence analysis is also provided. Experimental results demonstrate the efficiency of our algorithm.
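To illustrate how an L2,1-norm penalty supports feature selection, the sketch below computes the L2,1-norm of a transformation matrix W and ranks features by the Euclidean norms of the corresponding rows of W. This is a minimal illustration under our own assumptions, not the paper's full iterative algorithm; the helper names are ours.

```python
import numpy as np

def l21_norm(W):
    # L2,1-norm: sum of the Euclidean (L2) norms of the rows of W.
    # Penalizing this drives entire rows toward zero (row-sparsity).
    return float(np.sum(np.linalg.norm(W, axis=1)))

def select_features(W, k):
    # Rank features by the L2-norm of the corresponding row of W;
    # rows driven to (near) zero by the L2,1 penalty are discarded.
    scores = np.linalg.norm(W, axis=1)
    return np.argsort(scores)[::-1][:k]

# Toy transformation matrix: each row corresponds to one feature.
W = np.array([[3.0, 4.0],
              [0.0, 0.0],
              [1.0, 0.0]])
print(l21_norm(W))             # 5 + 0 + 1 = 6.0
print(select_features(W, 2))   # features 0 and 2 have the largest row norms
```

In the full method, W would be obtained by minimizing a clustering objective plus this L2,1 regularizer; here W is fixed only to show the row-sparsity mechanism.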



Acknowledgments
We gratefully acknowledge the support of the National Natural Science Foundation of China under Grants No. 60975038 and No. 60105003.
Cite this article
Yang, S., Hou, C., Nie, F. et al. Unsupervised maximum margin feature selection via L2,1-norm minimization. Neural Comput & Applic 21, 1791–1799 (2012). https://doi.org/10.1007/s00521-012-0827-3