
Supervised Feature Extraction Using Hilbert-Schmidt Norms

  • Conference paper
Intelligent Data Engineering and Automated Learning - IDEAL 2009

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 5788)

Abstract

We propose a novel supervised feature extraction procedure based on an unbiased estimator of the Hilbert-Schmidt independence criterion (HSIC). The procedure can be applied directly to single-label or multi-label data, and its kernelized version extends to any data type on which a positive definite kernel function is defined. Computer experiments on various classification data sets show that our approach can be applied more efficiently than the alternatives.
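To make the construction concrete, here is a minimal NumPy sketch of HSIC-based supervised feature extraction. It is not the authors' implementation: the function names are illustrative, the label kernel is assumed to be the linear kernel L = YY^T on a (multi-)label indicator matrix Y, and the projection step relies on the standard observation that, for a linear kernel on projected inputs, the empirical HSIC tr(KHLH) equals tr(W^T X^T H L H X W) up to scaling, so it is maximized by the top eigenvectors of X^T H L H X. The unbiased estimator follows the published formula of Song et al. (2007).

import numpy as np

def hsic_biased(K, L):
    """Biased empirical HSIC estimate: tr(K H L H) / (m - 1)^2."""
    m = K.shape[0]
    H = np.eye(m) - np.ones((m, m)) / m   # centering matrix
    return np.trace(K @ H @ L @ H) / (m - 1) ** 2

def hsic_unbiased(K, L):
    """Unbiased HSIC estimate (Song et al., 2007); requires m >= 4."""
    m = K.shape[0]
    Kt, Lt = K.copy(), L.copy()
    np.fill_diagonal(Kt, 0.0)             # zero the diagonals
    np.fill_diagonal(Lt, 0.0)
    one = np.ones(m)
    term = (np.trace(Kt @ Lt)
            + (one @ Kt @ one) * (one @ Lt @ one) / ((m - 1) * (m - 2))
            - 2.0 / (m - 2) * (one @ Kt @ Lt @ one))
    return term / (m * (m - 3))

def hsic_projection(X, Y, d):
    """Project X (m x p) onto the d directions maximizing empirical HSIC
    between the projected features (linear kernel) and the label kernel
    L = Y Y^T (illustrative assumption), Y being a label indicator matrix."""
    m = X.shape[0]
    H = np.eye(m) - np.ones((m, m)) / m
    L = Y @ Y.T                           # linear kernel on the labels
    M = X.T @ H @ L @ H @ X               # tr(W^T M W) is HSIC up to scaling
    vals, vecs = np.linalg.eigh(M)        # M is symmetric; eigh is safe
    W = vecs[:, np.argsort(vals)[::-1][:d]]  # top-d eigenvectors
    return X @ W, W

In the kernelized variant mentioned in the abstract, X in the sketch would be replaced by a kernel matrix over the inputs, which is what lets the method handle any data type equipped with a positive definite kernel.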

Copyright information

© 2009 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Daniušis, P., Vaitkus, P. (2009). Supervised Feature Extraction Using Hilbert-Schmidt Norms. In: Corchado, E., Yin, H. (eds) Intelligent Data Engineering and Automated Learning - IDEAL 2009. IDEAL 2009. Lecture Notes in Computer Science, vol 5788. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-04394-9_4

  • DOI: https://doi.org/10.1007/978-3-642-04394-9_4

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-04393-2

  • Online ISBN: 978-3-642-04394-9

  • eBook Packages: Computer Science (R0)
