
Inter-class Sparsity Based Non-negative Transition Sub-space Learning

  • Conference paper
Pattern Recognition and Computer Vision (PRCV 2023)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 14427)


Abstract

Least squares regression has shown promising performance in supervised classification. However, conventional least squares regression faces two limitations that severely restrict its effectiveness. First, the strict zero-one label matrix it uses provides limited freedom for classification. Second, the modeling process does not fully exploit the correlations among samples from the same class. To address these issues, this paper proposes the inter-class sparsity-based non-negative transition sub-space learning (ICSN-TSL) method. Our approach exploits a transition sub-space to bridge the raw image space and the label space. By learning two distinct transformation matrices, we obtain a low-dimensional representation of the data while preserving model flexibility. Additionally, an inter-class sparsity term is introduced to learn a more discriminative projection matrix. Experimental results on image databases demonstrate the superiority of ICSN-TSL over existing methods in terms of recognition rate. The proposed ICSN-TSL achieves a recognition rate of up to 98% under normal conditions, and it retains a classification accuracy of over 87% even on artificially corrupted images.
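
The abstract's core idea (two transformation matrices linked through a non-negative transition sub-space, plus a class-wise sparsity penalty) can be illustrated with a small numerical sketch. The simplified objective below, the alternating-gradient solver, and all names (icsn_tsl_sketch, lam_sparse, lam_reg, k, lr) are assumptions made for illustration only; they do not reproduce the paper's actual formulation or optimization scheme.

```python
import numpy as np

def icsn_tsl_sketch(X, y, k=30, lam_sparse=0.1, lam_reg=0.01,
                    n_iter=100, lr=1e-3, seed=0):
    """Alternating-gradient sketch of a simplified objective in the spirit of
    ICSN-TSL (names and solver are illustrative, not the authors' algorithm):

        min_{W, Q >= 0}  0.5*||X W Q - Y||_F^2
                         + lam_sparse * sum_c ||X_c W||_{2,1}
                         + 0.5*lam_reg * ||W||_F^2

    X: (n, d) training samples, y: (n,) integer labels, Y: one-hot labels.
    W (d, k) projects data into a k-dimensional transition sub-space;
    the non-negative Q (k, C) maps that sub-space to the label space.
    The class-wise L2,1 term is handled with a standard reweighted surrogate
    so that each class tends to use its own subset of sub-space dimensions.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    classes = np.unique(y)
    Y = (y[:, None] == classes[None, :]).astype(float)   # one-hot label matrix

    W = 0.01 * rng.standard_normal((d, k))
    Q = np.abs(0.01 * rng.standard_normal((k, classes.size)))

    for _ in range(n_iter):
        Z = X @ W                      # codes in the transition sub-space
        R = Z @ Q - Y                  # residual in the label space

        # Reweighted surrogate of sum_c ||X_c W||_{2,1}: within each class,
        # scale the code columns by the inverse of their column norms.
        G = np.zeros_like(Z)
        for c in classes:
            rows = (y == c)
            col_norms = np.linalg.norm(Z[rows], axis=0) + 1e-8
            G[rows] = Z[rows] / col_norms

        # Gradient step on W (projection into the transition sub-space).
        grad_W = X.T @ R @ Q.T + lam_sparse * (X.T @ G) + lam_reg * W
        W = W - lr * grad_W

        # Projected gradient step on Q keeps the transition matrix non-negative.
        Z = X @ W
        grad_Q = Z.T @ (Z @ Q - Y)
        Q = np.maximum(Q - lr * grad_Q, 0.0)

    return W, Q, classes
```

As a usage sketch, one would fit W and Q on training data and label a test sample by the largest soft-label entry, e.g. classes[np.argmax(X_test @ W @ Q, axis=1)]; the non-negativity of Q and the class-wise column sparsity of X @ W are what distinguish this toy model from plain ridge-style least squares regression.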

This work was supported in part by the Natural Science Foundation of China under Grant No. 62106052 and Grant No. 62072118, in part by the Guangdong Basic and Applied Basic Research Foundation under Grant No. 2021B1515120010, and in part by the Huangpu International Sci & Tech Cooperation Foundation of Guangzhou, China, under Grant No. 2021GH12.



Author information

Corresponding author

Correspondence to Jigang Wu.


Copyright information

© 2024 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this paper

Cite this paper

Li, M., Zhao, S., Wu, J., Ma, S. (2024). Inter-class Sparsity Based Non-negative Transition Sub-space Learning. In: Liu, Q., et al. Pattern Recognition and Computer Vision. PRCV 2023. Lecture Notes in Computer Science, vol 14427. Springer, Singapore. https://doi.org/10.1007/978-981-99-8435-0_20

  • DOI: https://doi.org/10.1007/978-981-99-8435-0_20

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-99-8434-3

  • Online ISBN: 978-981-99-8435-0

  • eBook Packages: Computer Science; Computer Science (R0)
