Abstract
In this paper, we propose a novel linear dimensionality reduction algorithm, Orthogonal Projection Analysis (OPA), derived from a gradient field perspective. Our approach is based on two criteria. First, the linear map should preserve the metric of the ambient space, under the assumption that this metric is reliable. Second, the map should satisfy the well-known smoothness criterion, which is critical for clustering. Interestingly, the gradient field is a natural tool for connecting these two requirements. We give a continuous objective function based on gradient fields and discuss how to discretize it using tangent spaces. We also show the geometric meaning of our approach: it requires the gradient field to be as orthogonal as possible to the tangent spaces. Experimental results demonstrate the effectiveness of the proposed approach.
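The abstract mentions discretizing the continuous objective using tangent spaces. The paper's exact discretization is not given here, but a common way to estimate the tangent space at each data point is local PCA over a nearest-neighbor patch; the sketch below illustrates that step only. The function name `local_tangent_spaces` and the parameters `k` (neighborhood size) and `d` (intrinsic dimension) are illustrative assumptions, not the authors' notation.

```python
import numpy as np

def local_tangent_spaces(X, k=8, d=2):
    """Estimate a d-dimensional tangent basis at each row of X via
    PCA over its k nearest neighbors (hypothetical helper; the
    paper's actual discretization may differ)."""
    n, m = X.shape
    # pairwise squared Euclidean distances
    D = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    bases = np.empty((n, m, d))
    for i in range(n):
        idx = np.argsort(D[i])[1:k + 1]        # k nearest neighbors, excluding the point itself
        nbrs = X[idx] - X[idx].mean(axis=0)    # center the local patch
        # the top-d right singular vectors span the estimated tangent space
        _, _, Vt = np.linalg.svd(nbrs, full_matrices=False)
        bases[i] = Vt[:d].T                    # m x d orthonormal basis
    return bases

# toy example: points lying on a 2-D plane embedded in 3-D space
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 2)) @ rng.standard_normal((2, 3))
T = local_tangent_spaces(X, k=10, d=2)
```

Each returned basis `T[i]` has orthonormal columns, so quantities such as the component of a gradient vector orthogonal to the tangent space can be computed as `g - T[i] @ (T[i].T @ g)`.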
© 2012 Springer-Verlag Berlin Heidelberg
Lin, B., Zhang, C., He, X. (2012). Orthogonal Projection Analysis. In: Zhang, Y., Zhou, ZH., Zhang, C., Li, Y. (eds) Intelligent Science and Intelligent Data Engineering. IScIDE 2011. Lecture Notes in Computer Science, vol 7202. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-31919-8_1
Print ISBN: 978-3-642-31918-1
Online ISBN: 978-3-642-31919-8