Abstract
Manifold learning (ML) is a topic of great interest in machine learning that aims to find low-dimensional embeddings of data that preserve the intrinsic structure of the underlying data manifold. Many ML techniques have been proposed to learn this underlying manifold. When selecting an appropriate ML technique in practice, it is crucial to evaluate the quality of the resulting embeddings effectively. However, effective embedding quality assessment (EQA) criteria are still lacking. In this paper, a new local included angles preservation (LUNA) criterion is proposed to evaluate embedding quality. Unlike previous EQA methods, which focus mainly on local neighborhood preservation or on the preservation of the global geometric structure, the proposed LUNA criterion assesses both neighborhood preservation and local included angle preservation. By introducing an effective evaluation of how well local included angles are preserved, we show that the LUNA criterion can provide a more reasonable quality assessment than conventional criteria. The LUNA criterion is also simple and straightforward to implement. To the best of our knowledge, this is the first EQA method that explicitly takes the preservation of local included angles into account. The effectiveness of the LUNA criterion is experimentally supported by its outstanding performance on a series of benchmark datasets.
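For intuition only, the sketch below (plain NumPy) shows one way the two ingredients named in the abstract, neighborhood preservation and local included-angle preservation, could be combined into a single score. The helper names (`_knn`, `_angle`, `toy_angle_quality`), the neighborhood-overlap term, and the equal weighting are assumptions made for this illustration; it is not the LUNA formula defined in the paper.

```python
import numpy as np

def _knn(X, k):
    """Indices of the k nearest neighbors of each row of X (self excluded)."""
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(D, np.inf)
    return np.argsort(D, axis=1)[:, :k]

def _angle(u, v, eps=1e-12):
    """Included angle (in radians) between vectors u and v."""
    c = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v) + eps)
    return np.arccos(np.clip(c, -1.0, 1.0))

def toy_angle_quality(X_high, X_low, k=10):
    """Hypothetical angle-aware quality score in [0, 1]; higher is better.

    Combines (a) the fraction of k-nearest neighbors each point keeps in the
    embedding with (b) how well the included angles formed at each point with
    pairs of its retained neighbors are preserved. This is NOT the LUNA
    criterion itself, only an illustration of the same two ingredients.
    """
    n = X_high.shape[0]
    nn_high, nn_low = _knn(X_high, k), _knn(X_low, k)
    overlap, angle_sim = [], []
    for i in range(n):
        # Neighbors of point i that survive the embedding.
        shared = np.intersect1d(nn_high[i], nn_low[i])
        overlap.append(len(shared) / k)
        # Compare included angles at point i formed with pairs of shared neighbors.
        for a, b in zip(shared[:-1], shared[1:]):
            theta_high = _angle(X_high[a] - X_high[i], X_high[b] - X_high[i])
            theta_low = _angle(X_low[a] - X_low[i], X_low[b] - X_low[i])
            angle_sim.append(1.0 - abs(theta_high - theta_low) / np.pi)
    angle_term = np.mean(angle_sim) if angle_sim else 0.0
    return 0.5 * np.mean(overlap) + 0.5 * angle_term
```

On toy data one would call `toy_angle_quality(X, Y, k=10)` with `X` the original points and `Y` their low-dimensional embedding; the 50/50 weighting and the choice of `k` are illustrative assumptions, not values taken from the paper.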








Acknowledgements
This work was supported by the National Science Foundation of China (Grant Nos. 61375065 and 61625204) and partially supported by the State Key Program of the National Science Foundation of China (Grant Nos. 61432012 and 61432014).
Ethics declarations
Conflict of interest
We declare that we have no financial or personal relationships with other people or organizations that could inappropriately influence our work, and that there is no professional or other personal interest of any nature or kind in any product, service, and/or company that could be construed as influencing the position presented in, or the review of, this manuscript.
About this article
Cite this article
Chen, D., Lv, J., Yin, J. et al. Angle-based embedding quality assessment method for manifold learning. Neural Comput & Applic 31, 839–849 (2019). https://doi.org/10.1007/s00521-017-3113-6