Abstract
Feature selection is an important component of many machine learning applications. In this paper, we propose a new robust feature selection method for multi-class, multi-label learning. In particular, feature correlation is incorporated into the sparse-learning objective, so that feature correlation is learned and feature selection is performed simultaneously. An efficient algorithm with rapid convergence is introduced. Our regression-based objective makes the feature selection process more efficient. Experiments on benchmark data sets show that the proposed method outperforms many state-of-the-art feature selection methods.
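The abstract describes a regression-based sparse-learning objective for multi-label feature selection. The paper's exact objective (including its feature-correlation term) is not given here, so the sketch below shows only the common baseline formulation it builds on: minimize ||XW − Y||²_F + γ||W||_{2,1} by iterative reweighting, then rank features by the row norms of W. The function name, the reweighting scheme, and all parameter values are illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

def l21_feature_select(X, Y, gamma=1.0, n_iter=50, eps=1e-8):
    """Sketch: minimize ||XW - Y||_F^2 + gamma * ||W||_{2,1}
    via iteratively reweighted least squares (assumed baseline,
    not the paper's correlation-augmented objective).

    X: (n_samples, n_features) data matrix.
    Y: (n_samples, n_labels) multi-label indicator/target matrix.
    Returns per-feature scores (l2 norms of rows of W);
    larger scores mark more important features.
    """
    n, d = X.shape
    D = np.eye(d)  # reweighting matrix, initialized to identity
    for _ in range(n_iter):
        # Closed-form update: W = (X^T X + gamma * D)^{-1} X^T Y
        W = np.linalg.solve(X.T @ X + gamma * D, X.T @ Y)
        # Update D from the current row norms of W; eps guards
        # against division by zero for rows driven to zero.
        row_norms = np.linalg.norm(W, axis=1)
        D = np.diag(1.0 / (2.0 * np.maximum(row_norms, eps)))
    return np.linalg.norm(W, axis=1)
```

The ℓ2,1 penalty couples all labels when zeroing out a feature's entire row of W, which is what makes the selection joint across labels; features are then ranked by `np.argsort(-scores)`.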
Copyright information
© 2013 Springer-Verlag Berlin Heidelberg
Cite this paper
Huang, L.L., Tang, J., Chen, S.B., Ding, C., Luo, B.: An Efficient Algorithm for Feature Selection with Feature Correlation. In: Yang, J., Fang, F., Sun, C. (eds.) Intelligent Science and Intelligent Data Engineering, IScIDE 2012. Lecture Notes in Computer Science, vol. 7751. Springer, Berlin, Heidelberg (2013). https://doi.org/10.1007/978-3-642-36669-7_78
Print ISBN: 978-3-642-36668-0
Online ISBN: 978-3-642-36669-7