Summary
We give a tutorial overview of several geometric methods for feature extraction and dimensional reduction. We divide the methods into projective methods and methods that model the manifold on which the data lies. For projective methods, we review projection pursuit, principal component analysis (PCA), kernel PCA, probabilistic PCA, and oriented PCA; and for the manifold methods, we review multidimensional scaling (MDS), landmark MDS, Isomap, locally linear embedding, Laplacian eigenmaps, and spectral clustering. The Nyström method, which links several of the algorithms, is also reviewed. The goal is to provide a self-contained review of the concepts and mathematics underlying these algorithms.
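The chapter itself develops each method in detail; as a minimal orientation, the sketch below (an illustrative addition, not code from the chapter) shows the basic projective step that PCA-style methods share: center the data, eigendecompose the sample covariance, and project onto the leading eigenvectors.

    import numpy as np

    def pca_project(X, d):
        """Minimal PCA sketch: project the rows of X (n samples x p
        features) onto the top-d principal directions. Illustrative only."""
        Xc = X - X.mean(axis=0)             # center each feature
        C = (Xc.T @ Xc) / (X.shape[0] - 1)  # sample covariance, p x p
        evals, evecs = np.linalg.eigh(C)    # eigenvalues in ascending order
        W = evecs[:, ::-1][:, :d]           # top-d eigenvectors, descending
        return Xc @ W                       # n x d coordinates

    # Example: embed 200 five-dimensional points in the plane.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))
    Y = pca_project(X, 2)                   # shape (200, 2)

Kernel PCA and the manifold methods replace the covariance eigenproblem with an eigenproblem on a kernel or graph matrix, but the project-onto-leading-eigenvectors step is the common thread.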
Acknowledgments
I thank John Platt for valuable discussions. Thanks also to Lawrence Saul, Bernhard Schölkopf, Jay Stokes and Mike Tipping for commenting on the manuscript.
Copyright information
© 2009 Springer Science+Business Media, LLC
About this chapter
Cite this chapter
Burges, C.J. (2009). Geometric Methods for Feature Extraction and Dimensional Reduction - A Guided Tour. In: Maimon, O., Rokach, L. (eds) Data Mining and Knowledge Discovery Handbook. Springer, Boston, MA. https://doi.org/10.1007/978-0-387-09823-4_4
Publisher Name: Springer, Boston, MA
Print ISBN: 978-0-387-09822-7
Online ISBN: 978-0-387-09823-4