Abstract
Projecting and visualizing objects in a two- or three-dimensional space is a standard data analysis task. Beyond this visualization, it can be useful to let the user add knowledge in the form of (dis)similarity constraints between objects when they appear either too close or too far apart in the observation space. In this paper we propose three kinds of constraints and present a resolution method derived from PCA. Experiments have been performed on both synthetic and standard datasets; they show that a relevant representation can be achieved with a limited set of constraints.
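The chapter's resolution method is not reproduced here. As a rough, illustrative sketch of the setting described in the abstract, the snippet below projects a standard dataset with plain PCA (scikit-learn) and then checks hypothetical user-supplied (dis)similarity constraints against the 2-D view. The constraint pairs, the distance threshold, and the violation helper are assumptions introduced for illustration only, not the authors' algorithm.

# Minimal sketch (not the chapter's method): plain PCA projection plus an
# illustrative check of user-supplied pairwise (dis)similarity constraints.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA

X = load_iris().data                      # a standard dataset (150 x 4)
Z = PCA(n_components=2).fit_transform(X)  # 2-D projection for visualization

# Hypothetical constraints: pairs of object indices the user judges
# "too far" (should appear closer) or "too close" (should appear farther).
must_be_close = [(0, 5), (10, 20)]
must_be_far = [(0, 100)]

def violations(pairs, embedding, threshold, kind):
    """Count constraint violations in the embedding at a given distance threshold."""
    d = [np.linalg.norm(embedding[i] - embedding[j]) for i, j in pairs]
    if kind == "close":
        return sum(dist > threshold for dist in d)   # should be close but are far
    return sum(dist < threshold for dist in d)       # should be far but are close

print("close-constraint violations:", violations(must_be_close, Z, 1.0, "close"))
print("far-constraint violations:  ", violations(must_be_far, Z, 1.0, "far"))

Counting violations of this kind is only a way to inspect a fixed projection; the chapter's contribution is to fold such constraints into the projection itself.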
Copyright information
© 2012 Springer Berlin Heidelberg
About this chapter
Cite this chapter
Martin, L., Exbrayat, M., Cleuziou, G., Moal, F. (2012). Interactive and Progressive Constraint Definition for Dimensionality Reduction and Visualization. In: Guillet, F., Ritschard, G., Zighed, D. (eds) Advances in Knowledge Discovery and Management. Studies in Computational Intelligence, vol 398. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-25838-1_7
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-25837-4
Online ISBN: 978-3-642-25838-1
eBook Packages: Engineering (R0)