Abstract
Most existing feature selection methods select features by evaluating a criterion that measures their ability to preserve the similarity structure of a data graph. However, these methods separate the construction (or learning) of the underlying data graph from the subsequent feature ranking. Once the graph is determined so as to characterize the structure of the similarity data, it is left fixed during the following ranking or regression steps, so the performance of feature selection is largely determined by the effectiveness of the graph-construction step. The key to constructing an effective similarity graph is to determine a data similarity matrix. In this paper we treat the estimation (or learning) of the data similarity matrix and the data regression as simultaneous tasks, and use them to perform unsupervised spectral feature selection. Our new method learns the data similarity matrix by optimally re-assigning the neighbors of each data point based on local distances or dissimilarities. Meanwhile, the \(\ell _{2,1}\)-norm is imposed on the transformation matrix to achieve row sparsity, which leads to the selection of relevant features. We derive an efficient optimization method that solves the similarity-graph learning and feature selection problems simultaneously. Extensive experimental results on real-world benchmark data sets show that our method consistently outperforms alternative feature selection methods.
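The two ingredients named in the abstract can be illustrated with a short sketch. The code below is a simplified, standalone approximation, not the paper's joint optimization: `adaptive_similarity` assigns each point's similarity weights to its nearest neighbors in closed form from local distances (the paper instead learns these weights jointly with the regression), and `l21_feature_scores` computes the row-wise \(\ell_2\) norms of a transformation matrix, the quantity that an \(\ell_{2,1}\) penalty drives to zero for irrelevant features. Function names and the fixed neighborhood size `k` are illustrative assumptions.

```python
import numpy as np

def adaptive_similarity(X, k=5):
    """Build a similarity matrix by giving each point nonnegative weights
    over its k nearest neighbours, decreasing with local distance.
    Simplified sketch only; the paper re-assigns neighbours adaptively
    inside a joint optimization rather than in one closed-form pass."""
    n = X.shape[0]
    # Pairwise squared Euclidean distances.
    D = np.square(X[:, None, :] - X[None, :, :]).sum(axis=-1)
    S = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(D[i])[1:k + 1]          # k nearest neighbours, skipping self
        d = D[i, idx]
        # Weights proportional to (d_max - d): closer neighbours get larger weight.
        denom = max(k * d[-1] - d.sum(), 1e-12)
        S[i, idx] = (d[-1] - d) / denom
    return (S + S.T) / 2                          # symmetrize

def l21_feature_scores(W):
    """Row-wise l2 norms of a (features x components) transformation matrix W.
    Under an l2,1-norm penalty, rows with small norm correspond to features
    that can be discarded; large-norm rows mark selected features."""
    return np.linalg.norm(W, axis=1)
```

Ranking features by `l21_feature_scores(W)` in descending order and keeping the top rows is the standard way an \(\ell_{2,1}\)-regularized transformation matrix is turned into a feature subset.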
Copyright information
© 2015 Springer International Publishing Switzerland
Cite this paper
Zhang, Z., Bai, L., Liang, Y., Hancock, E.R. (2015). Adaptive Graph Learning for Unsupervised Feature Selection. In: Azzopardi, G., Petkov, N. (eds) Computer Analysis of Images and Patterns. CAIP 2015. Lecture Notes in Computer Science(), vol 9256. Springer, Cham. https://doi.org/10.1007/978-3-319-23192-1_66
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-23191-4
Online ISBN: 978-3-319-23192-1