Feature Selection for Local Learning Based Clustering

  • Conference paper
Advances in Knowledge Discovery and Data Mining (PAKDD 2009)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 5476)

Abstract

The performance of most clustering algorithms depends strongly on the data representation. In this paper, we attempt to obtain better data representations through feature selection, particularly for Local Learning based Clustering (LLC) [1]. We assign a weight to each feature and incorporate it into the built-in regularization of the LLC algorithm to take into account the relevance of each feature for the clustering. The weights are then estimated iteratively along with the clustering. We show that the resulting weighted regularization, with an additional constraint on the weights, is equivalent to a known sparsity-promoting penalty, so the weights of irrelevant features can be driven towards zero. Experiments on several benchmark datasets demonstrate the effectiveness of the proposed method.
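To make the alternating scheme described above concrete, the sketch below shows a simplified feature-weighted clustering loop: cluster assignments and per-feature weights are updated in turn, with the weights kept on the simplex so that uninformative features are pushed towards zero. This is only an illustration of the general pattern, not the LLC-based method of the paper (which uses a local kernel regression objective); all function and variable names here are hypothetical.

```python
# Illustrative sketch only: a feature-weighted k-means-style loop, NOT the
# paper's LLC algorithm. It shows the abstract's idea of alternating between
# clustering and re-estimating per-feature weights under a sum-to-one constraint.
import numpy as np

def weighted_feature_clustering(X, n_clusters, n_iter=20, seed=None):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.full(d, 1.0 / d)                     # feature weights on the simplex
    labels = rng.integers(n_clusters, size=n)   # random initial assignment

    for _ in range(n_iter):
        # Cluster step: assign points to the nearest center in the weighted space.
        centers = np.vstack([
            X[labels == k].mean(axis=0) if np.any(labels == k)
            else X[rng.integers(n)]              # re-seed an empty cluster
            for k in range(n_clusters)
        ])
        dists = ((X[:, None, :] - centers[None, :, :]) ** 2 * w).sum(axis=2)
        labels = dists.argmin(axis=1)

        # Weight step: features with small within-cluster scatter are deemed
        # more relevant; renormalizing keeps the weights on the simplex, which
        # shrinks the weights of irrelevant features towards zero.
        scatter = np.zeros(d)
        for k in range(n_clusters):
            Xk = X[labels == k]
            if len(Xk):
                scatter += ((Xk - Xk.mean(axis=0)) ** 2).sum(axis=0)
        w = np.exp(-scatter / (scatter.mean() + 1e-12))
        w /= w.sum()

    return labels, w
```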


References

  1. Wu, M., Schölkopf, B.: A local learning approach for clustering. Advances in Neural Information Processing Systems 19, 1529–1536 (2007)

  2. Dash, M., Choi, K., Scheuermann, P., Liu, H.: Feature selection for clustering-a filter solution. In: Proceedings of IEEE International Conference on Data Mining, pp. 115–122 (2002)

  3. He, X., Cai, D., Niyogi, P.: Laplacian score for feature selection. Advances in Neural Information Processing Systems 18, 507–514 (2005)

  4. Cheung, Y.M., Zeng, H.: Local kernel regression score for feature selection. IEEE Transactions on Knowledge and Data Engineering (in press, 2009)

  5. Wolf, L., Shashua, A.: Feature selection for unsupervised and supervised inference: The emergence of sparsity in a weight-based approach. Journal of Machine Learning Research 6, 1855–1887 (2005)

  6. Dy, J., Brodley, C.: Feature selection for unsupervised learning. Journal of Machine Learning Research 5, 845–889 (2004)

  7. Law, M.H.C., Jain, A.K., Figueiredo, M.A.T.: Feature selection in mixture-based clustering. Advances in Neural Information Processing Systems 15, 609–616 (2003)

  8. Roth, V., Lange, T.: Feature selection in clustering problems. Advances in Neural Information Processing Systems 16, 473–480 (2004)

  9. Zeng, H., Cheung, Y.M.: Feature selection for clustering on high dimensional data. In: Proceedings of the Pacific Rim International Conference on Artificial Intelligence, pp. 913–922 (2008)

  10. Ng, A., Jordan, M., Weiss, Y.: On spectral clustering: Analysis and an algorithm. Advances in Neural Information Processing Systems 14, 849–856 (2002)

  11. Yu, S., Shi, J.: Multiclass spectral clustering. In: Proceedings of IEEE International Conference on Computer Vision, pp. 313–319 (2003)

  12. Obozinski, G., Taskar, B., Jordan, M.: Multi-task feature selection. Technical Report (2006)

  13. Alon, U., Barkai, N., Notterman, D., Gish, K., Ybarra, S., Mack, D., Levine, A.: Broad patterns of gene expression revealed by clustering analysis of tumor and normal colon tissues probed by oligonucleotide arrays. Proceedings of the National Academy of Sciences 96, 6745–6750 (1999)

  14. Khan, J., Wei, J., Ringnér, M., Saal, L., Ladanyi, M., Westermann, F., Berthold, F., Schwab, M., Antonescu, C., Peterson, C., et al.: Classification and diagnostic prediction of cancers using gene expression profiling and artificial neural networks. Nature Medicine 7, 673–679 (2001)

  15. West, M., Blanchette, C., Dressman, H., Huang, E., Ishida, S., Spang, R., Zuzan, H., Olson Jr., J., Marks, J., Nevins, J.: Predicting the clinical status of human breast cancer by using gene expression profiles. In: Proceedings of the National Academy of Sciences, vol. 98, pp. 11462–11467 (2001)

Copyright information

© 2009 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Zeng, H., Cheung, Y.M. (2009). Feature Selection for Local Learning Based Clustering. In: Theeramunkong, T., Kijsirikul, B., Cercone, N., Ho, T.B. (eds) Advances in Knowledge Discovery and Data Mining. PAKDD 2009. Lecture Notes in Computer Science, vol 5476. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-01307-2_38

Download citation

  • DOI: https://doi.org/10.1007/978-3-642-01307-2_38

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-01306-5

  • Online ISBN: 978-3-642-01307-2

  • eBook Packages: Computer Science, Computer Science (R0)
