Abstract
The performance of most clustering algorithms strongly depends on the data representation. In this paper, we attempt to obtain better data representations through feature selection, particularly for Local Learning based Clustering (LLC) [1]. We assign a weight to each feature and incorporate the weights into the built-in regularization of the LLC algorithm, so as to account for the relevance of each feature to the clustering. The weights are then estimated iteratively along with the clustering. We show that the resulting weighted regularization, with an additional constraint on the weights, is equivalent to a known sparsity-promoting penalty, so the weights of irrelevant features can be driven towards zero. Experiments on several benchmark datasets demonstrate the effectiveness of the proposed method.
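The equivalence the abstract mentions, between a feature-weighted squared regularizer with a simplex constraint on the weights and a sparsity-promoting (L1-type) penalty, is a standard variational identity; the sketch below illustrates that identity only, not the paper's exact LLC update. The names `beta` (per-feature coefficients) and `w` (feature weights) are illustrative, not taken from the paper.

```python
# Variational view: for fixed coefficients beta, minimizing
#     sum_j beta_j**2 / w_j   subject to   w_j >= 0,  sum_j w_j = 1
# is solved in closed form by w_j = |beta_j| / sum_k |beta_k|, and the
# minimum value equals (sum_j |beta_j|)**2, a squared L1 penalty.
# Hence a zero coefficient receives exactly zero weight: sparsity emerges.

def optimal_weights(beta):
    """Closed-form minimizer of the weighted regularizer on the simplex."""
    total = sum(abs(b) for b in beta)
    return [abs(b) / total for b in beta]

def weighted_reg(beta, w, eps=1e-12):
    """sum_j beta_j**2 / w_j (eps guards the 0/0 case when beta_j = w_j = 0)."""
    return sum(b * b / max(wj, eps) for b, wj in zip(beta, w))

beta = [0.5, -0.3, 0.0, 0.2]      # the third feature is irrelevant
w = optimal_weights(beta)          # -> its weight is driven to exactly 0
l1_squared = sum(abs(b) for b in beta) ** 2
assert abs(weighted_reg(beta, w) - l1_squared) < 1e-9
```

In an alternating scheme like the one the abstract describes, such a closed-form weight update would be interleaved with the clustering step, with the weighted regularizer implicitly imposing the L1-type penalty that zeroes out irrelevant features.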
References
Wu, M., Schölkopf, B.: A local learning approach for clustering. Advances in Neural Information Processing Systems 19, 1529–1536 (2007)
Dash, M., Choi, K., Scheuermann, P., Liu, H.: Feature selection for clustering-a filter solution. In: Proceedings of IEEE International Conference on Data Mining, pp. 115–122 (2002)
He, X., Cai, D., Niyogi, P.: Laplacian score for feature selection. Advances in Neural Information Processing Systems 18, 507–514 (2005)
Cheung, Y.M., Zeng, H.: Local kernel regression score for feature selection. IEEE Transactions on Knowledge and Data Engineering (in press, 2009)
Wolf, L., Shashua, A.: Feature selection for unsupervised and supervised inference: The emergence of sparsity in a weight-based approach. Journal of Machine Learning Research 6, 1855–1887 (2005)
Dy, J., Brodley, C.: Feature selection for unsupervised learning. Journal of Machine Learning Research 5, 845–889 (2004)
Law, M.H.C., Jain, A.K., Figueiredo, M.A.T.: Feature selection in mixture-based clustering. Advances in Neural Information Processing Systems 15, 609–616 (2003)
Roth, V., Lange, T.: Feature selection in clustering problems. Advances in Neural Information Processing Systems 16, 473–480 (2004)
Zeng, H., Cheung, Y.M.: Feature selection for clustering on high dimensional data. In: Proceedings of the Pacific Rim International Conference on Artificial Intelligence, pp. 913–922 (2008)
Ng, A., Jordan, M., Weiss, Y.: On spectral clustering: Analysis and an algorithm. Advances in Neural Information Processing Systems 14, 849–856 (2002)
Yu, S., Shi, J.: Multiclass spectral clustering. In: Proceedings of IEEE International Conference on Computer Vision, pp. 313–319 (2003)
Obozinski, G., Taskar, B., Jordan, M.: Multi-task feature selection. Technical Report (2006)
Alon, U., Barkai, N., Notterman, D., Gish, K., Ybarra, S., Mack, D., Levine, A.: Broad patterns of gene expression revealed by clustering analysis of tumor and normal colon tissues probed by oligonucleotide arrays. Proceedings of the National Academy of Sciences 96, 6745–6750 (1999)
Khan, J., Wei, J., Ringnér, M., Saal, L., Ladanyi, M., Westermann, F., Berthold, F., Schwab, M., Antonescu, C., Peterson, C., et al.: Classification and diagnostic prediction of cancers using gene expression profiling and artificial neural networks. Nature Medicine 7, 673–679 (2001)
West, M., Blanchette, C., Dressman, H., Huang, E., Ishida, S., Spang, R., Zuzan, H., Olson Jr., J., Marks, J., Nevins, J.: Predicting the clinical status of human breast cancer by using gene expression profiles. In: Proceedings of the National Academy of Sciences, vol. 98, pp. 11462–11467 (2001)
Copyright information
© 2009 Springer-Verlag Berlin Heidelberg
Cite this paper
Zeng, H., Cheung, Y.M. (2009). Feature Selection for Local Learning Based Clustering. In: Theeramunkong, T., Kijsirikul, B., Cercone, N., Ho, T.B. (eds) Advances in Knowledge Discovery and Data Mining. PAKDD 2009. Lecture Notes in Computer Science, vol 5476. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-01307-2_38
DOI: https://doi.org/10.1007/978-3-642-01307-2_38
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-01306-5
Online ISBN: 978-3-642-01307-2