Features and Metric from a Classifier Improve Visualizations with Dimension Reduction

  • Conference paper
Artificial Neural Networks – ICANN 2009 (ICANN 2009)

Part of the book series: Lecture Notes in Computer Science ((LNTCS,volume 5769))

Abstract

The goal of this work is to improve visualizations by using a task-related metric in dimension reduction. In a supervised setting, a metric can be learned directly from the data or extracted from a model fitted to the data. Here, two model-based approaches are explored: extracting a global metric from classifier parameters, and performing dimension reduction in the feature space of a classifier. Both approaches are tested using four dimension reduction methods and four real data sets, and both are found to improve visualization results. In particular, working in the classifier's feature space is beneficial for revealing possible cluster structure in the data.
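The second approach, visualizing in a classifier's feature space rather than in the raw input space, can be illustrated with a minimal sketch. The specific choices below (an `MLPClassifier` on the Iris data, its hidden-layer activations as features, and PCA for the final 2-D projection) are illustrative assumptions, not the models or data sets used in the paper:

```python
# Sketch: dimension reduction in a classifier's feature space.
# Train a small MLP, take its hidden-layer activations as features,
# then project those features to 2-D for plotting.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.neural_network import MLPClassifier
from sklearn.decomposition import PCA

X, y = load_iris(return_X_y=True)
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                    random_state=0).fit(X, y)

# Recompute hidden activations by hand: h = relu(X @ W0 + b0).
# ReLU is MLPClassifier's default activation.
h = np.maximum(0.0, X @ clf.coefs_[0] + clf.intercepts_[0])

# Any dimension reduction method could be applied to h; PCA keeps it simple.
emb = PCA(n_components=2).fit_transform(h)
print(emb.shape)  # (150, 2)
```

Because the classifier was trained to separate the classes, distances in `h` emphasize task-relevant directions, so the 2-D embedding tends to show class-related cluster structure more clearly than an embedding of the raw `X`.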




Copyright information

© 2009 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Parviainen, E., Vehtari, A. (2009). Features and Metric from a Classifier Improve Visualizations with Dimension Reduction. In: Alippi, C., Polycarpou, M., Panayiotou, C., Ellinas, G. (eds) Artificial Neural Networks – ICANN 2009. ICANN 2009. Lecture Notes in Computer Science, vol 5769. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-04277-5_23

  • DOI: https://doi.org/10.1007/978-3-642-04277-5_23

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-04276-8

  • Online ISBN: 978-3-642-04277-5

  • eBook Packages: Computer Science (R0)
