Interactive and Progressive Constraint Definition for Dimensionality Reduction and Visualization

Chapter in: Advances in Knowledge Discovery and Management

Part of the book series: Studies in Computational Intelligence (SCI, volume 398)

Abstract

Projecting and visualizing objects in a two- or three-dimensional space is a standard data analysis task. Beyond this visualization, it can be useful to let the user inject knowledge in the form of (dis)similarity constraints between objects, when these appear either too close or too far apart in the observation space. In this paper we propose three kinds of constraints and present a resolution method derived from PCA. Experiments have been performed on both synthetic and standard datasets. They show that a relevant representation can be achieved with a limited set of constraints.
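
A brief note for context: the abstract states that the resolution method derives from PCA. The sketch below shows only the unconstrained PCA baseline, projecting objects onto a two-dimensional observation space. It is a minimal illustration in Python/NumPy, not the authors' constrained method, and every name in it (pca_2d, the synthetic data) is an assumption made for the example.

```python
import numpy as np

def pca_2d(X):
    """Project the rows of X (n_samples x n_features) onto the two
    leading principal components, i.e. the plain PCA baseline that the
    chapter's constrained resolution method starts from."""
    # Center the data so the principal components pass through the mean.
    Xc = X - X.mean(axis=0)
    # SVD of the centered matrix: rows of Vt are the principal directions.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    # Keep the two leading directions and project onto them.
    W = Vt[:2].T                   # shape (n_features, 2)
    return Xc @ W                  # 2-D coordinates of each object

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 5))  # illustrative synthetic data
    Y = pca_2d(X)
    print(Y.shape)                 # (100, 2)
```

In the chapter, this projection is further adjusted so that user-supplied (dis)similarity constraints between pairs of objects are taken into account; the exact formulation is given in the full text.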

Author information

Correspondence to Lionel Martin.

Copyright information

© 2012 Springer Berlin Heidelberg

About this chapter

Cite this chapter

Martin, L., Exbrayat, M., Cleuziou, G., Moal, F. (2012). Interactive and Progressive Constraint Definition for Dimensionality Reduction and Visualization. In: Guillet, F., Ritschard, G., Zighed, D. (eds) Advances in Knowledge Discovery and Management. Studies in Computational Intelligence, vol 398. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-25838-1_7

  • DOI: https://doi.org/10.1007/978-3-642-25838-1_7

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-25837-4

  • Online ISBN: 978-3-642-25838-1

  • eBook Packages: Engineering (R0)
