
Robust Spectral Subspace Clustering Based on Least Square Regression

Neural Processing Letters

Abstract

In recent years, graph-based subspace clustering has attracted considerable attention in computer vision owing to its ability to cluster data efficiently. However, the graph weights built from representation coefficients do not exactly match the traditional definition of graph weights; that is, the representation and clustering steps are carried out independently, so an overall optimal result cannot be guaranteed. To this end, this paper proposes a novel subspace clustering method that learns an adaptive graph affinity matrix, in which the soft labels and the representation coefficients of the data are learned in a unified framework. First, the proposed method learns a robust representation of the data through least square regression, which reveals the subspace structure within the data and captures various types of noise. Second, the segmentation is obtained by performing spectral clustering simultaneously. Most importantly, during optimization the segmentation is used to iteratively enhance the block-diagonal structure of the learned representation, which in turn assists the clustering. Experimental results on several well-known databases demonstrate that the proposed method outperforms state-of-the-art approaches in clustering.
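For readers who want a concrete picture of the classical two-step pipeline that the abstract contrasts the proposed method with, the sketch below shows a plain least-squares-regression (LSR) baseline: representation coefficients obtained in closed form, an affinity matrix built from those coefficients, and spectral clustering applied afterwards. This is only an illustrative baseline under assumed names and parameter values, not the unified model proposed in the paper.

```python
# Illustrative two-step LSR baseline (closed-form coefficients, then spectral clustering).
# Not the paper's unified method; function names and the regularization value are assumptions.
import numpy as np
from sklearn.cluster import SpectralClustering

def lsr_spectral_clustering(X, n_clusters, lam=0.1):
    """X: d x n data matrix whose columns are samples."""
    n = X.shape[1]
    # Closed-form solution of  min_Z ||X - XZ||_F^2 + lam * ||Z||_F^2,
    # i.e.  Z = (X^T X + lam I)^{-1} X^T X.
    G = X.T @ X
    Z = np.linalg.solve(G + lam * np.eye(n), G)
    # Symmetric, nonnegative affinity built from the representation coefficients.
    W = 0.5 * (np.abs(Z) + np.abs(Z).T)
    # Spectral clustering on the precomputed affinity yields the segmentation.
    return SpectralClustering(n_clusters=n_clusters,
                              affinity='precomputed',
                              random_state=0).fit_predict(W)
```

The proposed method differs in that the representation and the segmentation are optimized jointly, so the cluster labels can iteratively reinforce the block-diagonal structure of the representation rather than being computed once after the fact.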


Notes

  1. c is the number of clusters.

  2. The entropy \(\text {H}(K)\) is defined as \(\text {H}(K)= -\sum _{y \in K}{p(y)\log _2 p(y)}\), where p(y) is the probability that a sample belongs to cluster y.
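As a minimal sketch, the entropy above can be computed from a vector of predicted cluster labels as follows (the helper name is illustrative):

```python
import numpy as np

def clustering_entropy(labels):
    """Entropy H(K) of a clustering, given a 1-D array of cluster labels."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()             # p(y): fraction of samples in cluster y
    return float(-np.sum(p * np.log2(p)))
```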


Author information


Corresponding author

Correspondence to Ming Yin.

Additional information

This work was supported by National Natural Science Foundation of China Grants 61473253, 11301427 and 61773130, and in part by the Fundamental Research Funds for the Central Universities under Grant No. XDJK2014B021. Ming Yin's work was supported in part by the Guangdong Natural Science Foundation under Grant No. 2014A030313511, in part by the Scientific Research Foundation for the Returned Overseas Chinese Scholars, State Education Ministry, China, in part by the Science and Technology Planning Project of Guangdong Province, China (No. 2017A010101024), and in part by the Science and Technology Program of Guangzhou, China (No. 201604016086).


About this article


Cite this article

Wu, Z., Yin, M., Zhou, Y. et al. Robust Spectral Subspace Clustering Based on Least Square Regression. Neural Process Lett 48, 1359–1372 (2018). https://doi.org/10.1007/s11063-017-9726-z
