
Robust image representation learning via low-rank consistency regularization for subspace clustering

  • Original Paper
  • Published in Signal, Image and Video Processing

Abstract

Image representation learning techniques aim to extract meaningful features from high-dimensional image data to enhance performance in downstream clustering and classification tasks. Recently, low-rank representation (LRR) methods have shown promise for uncovering the hidden low-dimensional subspace structure embedded in high-dimensional data. However, real-world data often deviate from LRR's idealized assumption that similar samples lie close together in the feature space. In particular, data corruption can distort spatial relationships, misleading LRR into treating corrupted samples that happen to lie near samples from other classes as similar to them, which introduces spurious correlations and yields sub-optimal clustering outcomes. In this paper, we propose a novel method that uses a low-rank consistency regularization (LCR) to overcome this limitation. LCR enters the classical LRR model as a dual regularization term, with the aim of adaptively finding an optimal low-rank representation that minimizes the distance between similar samples in the feature space. A flexible similarity matrix is learned simultaneously to adaptively capture accurate similarities between samples. Unlike existing methods, this similarity matrix is used directly for clustering by imposing a rank constraint on its Laplacian matrix. Experimental results on multiple benchmark image datasets show that our method is more efficient than state-of-the-art LRR approaches. Additionally, our method exhibits greater robustness to corruption across various experimental conditions.
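For readers unfamiliar with the baseline model the abstract refers to, the sketch below recalls the classical LRR objective and the standard Laplacian rank condition that allows a similarity matrix to be used directly for clustering. The exact objective of the proposed LCR model is not reproduced here, and the notation (X, Z, E, S, lambda, c) is the conventional one rather than the paper's.

% Classical LRR (a minimal sketch): X is the d x n data matrix, Z the low-rank
% representation, E the error term, ||.||_* the nuclear norm, ||.||_{2,1} the
% column-wise l_{2,1} norm, and lambda > 0 a trade-off parameter.
\min_{Z,E}\ \|Z\|_* + \lambda\,\|E\|_{2,1}
\quad\text{s.t.}\quad X = XZ + E.

% Standard fact behind the rank constraint on the Laplacian of a similarity
% matrix S: with L_S = \mathrm{diag}(S\mathbf{1}) - \tfrac{1}{2}(S + S^\top),
% \operatorname{rank}(L_S) = n - c holds exactly when the graph induced by S
% has c connected components, so the c clusters can be read off S directly.

A toy illustration of the last point, using a hypothetical block-diagonal similarity matrix (not the one learned by the proposed model):

import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.csgraph import connected_components

# Hypothetical block-diagonal similarity matrix: two clusters of three samples.
S = np.array([[0, 1, 1, 0, 0, 0],
              [1, 0, 1, 0, 0, 0],
              [1, 1, 0, 0, 0, 0],
              [0, 0, 0, 0, 1, 1],
              [0, 0, 0, 1, 0, 1],
              [0, 0, 0, 1, 1, 0]], dtype=float)

W = (S + S.T) / 2                          # symmetrize
L = np.diag(W.sum(axis=1)) - W             # graph Laplacian
c_from_rank = W.shape[0] - np.linalg.matrix_rank(L)

# With rank(L) = n - c, cluster labels are the connected components of the
# similarity graph; no separate spectral clustering step is required.
n_comp, labels = connected_components(csr_matrix(W), directed=False)
print(c_from_rank, n_comp, labels)         # 2 2 [0 0 0 1 1 1]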



Data availability

The datasets used in our experiments can be accessed through the following links:
  • UCI: https://archive.ics.uci.edu/ml/datasets/Optical+Recognition+of+Handwritten+Digits
  • USPS: https://www.kaggle.com/bistaumanga/usps-dataset
  • ORL: http://cam-orl.co.uk/facedatabase.html
  • COIL20: https://www.cs.columbia.edu/CAVE/software/softlib/coil-20.php



Funding

This work was not funded by any organization.

Author information


Contributions

Conceptualisation: S.E.A. and G.E.O.; Methodology: S.E.A., G.E.O. and R.C.N.; Software: S.E.A. and G.E.O.; Supervision: N.J.E.; Writing - original draft: S.E.A. and G.E.O.; Writing - review and editing: A.O.A. and C.N.A.

Corresponding author

Correspondence to Nnamdi Johnson Ezeora.

Ethics declarations

Conflict of interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Abhadiomhen, S.E., Okereke, G.E., Nzeh, R.C. et al. Robust image representation learning via low-rank consistency regularization for subspace clustering. SIViP 19, 295 (2025). https://doi.org/10.1007/s11760-025-03869-3
