Multi-dictionary induced low-rank representation with multi-manifold regularization

Abstract

Low-rank representation (LRR) is a highly competitive technique in many real-world applications because of its robustness in processing noisy or corrupted data. In this paper, a multi-dictionary induced LRR method (MDLRR) is proposed. Unlike traditional LRR methods, which treat every data point equally, MDLRR scales the importance of each data point up or down independently through a penalty factor. The penalty factor of a data point is the sum of the distances between it and the remaining points, and it is taken as a measure of the probability that the point is an outlier. These factors form a penalty dictionary that is imposed on the dataset to obtain a better low-rank structure in which clean data are promoted. To learn a common, view-free low-rank structure in multi-view datasets, multiple such dictionaries are used in the proposed method. In addition, a multi-manifold regularization is adopted to preserve the multiple manifolds when learning the multi-view low-rank data representation; the resulting method is denoted MDLRR-MM. MDLRR-MM therefore benefits from learning both the multiple local manifolds and the global low-rank subspace of multi-view datasets. Extensive experimental results on a variety of applications, including background modeling from video, face recognition, and denoising of multi-view images, show that MDLRR-MM significantly outperforms several state-of-the-art low-rank methods in subspace clustering and classification with data recovery from multi-view noisy data, and that it remains robust in moderately noisy scenarios.
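
A minimal sketch of the penalty-factor idea described in the abstract (this is one reading of the abstract, not the authors' implementation): for each sample, sum its distances to all other samples, treat that sum as an outlier score, and place the resulting weights on the diagonal of a dictionary that rescales the data before a low-rank solve. The Euclidean distance, the min-max normalization, and the diagonal form of the dictionary are assumptions made for illustration only.

```python
import numpy as np

def penalty_dictionary(X):
    """Build a diagonal penalty dictionary from summed pairwise distances.

    X is a (d, n) data matrix with one sample per column. The penalty factor
    of sample i is the sum of Euclidean distances from column i to all other
    columns; a larger sum is read as a higher probability of being an outlier.
    Normalizing so that "clean" points receive weights near 1 is an assumption.
    """
    sq_norms = np.sum(X ** 2, axis=0)                  # squared column norms, shape (n,)
    sq_dists = sq_norms[:, None] + sq_norms[None, :] - 2.0 * (X.T @ X)
    dists = np.sqrt(np.maximum(sq_dists, 0.0))         # pairwise Euclidean distances
    penalty = dists.sum(axis=1)                        # sum distance of each point to the rest
    weights = 1.0 - (penalty - penalty.min()) / (penalty.max() - penalty.min() + 1e-12)
    return np.diag(weights)

# Usage: rescale the data with the dictionary before a standard LRR solve.
X = np.random.randn(20, 50)    # 50 samples in 20 dimensions
D = penalty_dictionary(X)
X_weighted = X @ D             # central samples keep their scale, likely outliers are shrunk
```

In a full MDLRR-MM-style pipeline, as described above, one such dictionary would presumably be built per view, and the low-rank representation would then be learned on the weighted views together with the multi-manifold regularizer.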

Notes

  1. http://vision.uscd.edu/leekc/ExtYaleDatabase/ExtYaleB.html.

  2. http://www.cs.cmu.edu/afs/cs/project/PIE/MultiPie/Multi-Pie/Home.html

  3. https://www.cs.columbia.edu/CAVE/software/softlib/coil-20.php

Acknowledgements

This work was funded in part by the National Natural Science Foundation of China (No. 61572240) and the Science and Technology Planning Social Development Project of Zhenjiang City (SH2021006).

Author information

Corresponding author

Correspondence to Qian Zhu.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Zhou, J., Shen, X., Liu, S. et al. Multi-dictionary induced low-rank representation with multi-manifold regularization. Appl Intell 53, 3576–3593 (2023). https://doi.org/10.1007/s10489-022-03446-y
