ABSTRACT
Over the past decade, a number of nonlinear dimensionality reduction methods built on an affinity graph have been developed for manifold learning. This paper explores a multilevel framework that aims to reduce the cost of unsupervised manifold learning while preserving embedding quality. An application to spectral clustering is also presented. Experimental results indicate that our multilevel approach is an appealing alternative to standard techniques.
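The spectral clustering application mentioned above rests on eigenvectors of a graph Laplacian. As a point of reference (this is not the paper's multilevel algorithm, just the standard building block), here is a minimal sketch of two-way spectral bisection on a hypothetical toy graph of two triangles joined by a bridge edge:

```python
import numpy as np

def spectral_bisection(adj):
    """Split a graph in two using the sign of the Fiedler vector."""
    deg = np.diag(adj.sum(axis=1))
    lap = deg - adj                    # unnormalized graph Laplacian L = D - A
    vals, vecs = np.linalg.eigh(lap)   # eigenpairs in ascending order
    fiedler = vecs[:, 1]               # eigenvector of 2nd-smallest eigenvalue
    return (fiedler >= 0).astype(int)  # cluster label per node

# Toy example: triangles {0,1,2} and {3,4,5} connected by edge (2,3).
edges = [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]
n = 6
A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0

labels = spectral_bisection(A)  # one label per triangle
```

Multilevel variants coarsen the affinity graph first, solve the eigenproblem on the small coarse graph, and then prolong the result back to the full data set, which is where the cost savings come from.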
Index Terms
- Multilevel manifold learning with application to spectral clustering