DOI: 10.1145/3573942.3574101

Robust Principal Component Analysis Based on Globally-convergent Iteratively Reweighted Least Squares

Published: 16 May 2023

Abstract

Classical Robust Principal Component Analysis (RPCA) relaxes the matrix rank to the nuclear norm and solves the resulting convex problem with the singular value thresholding (SVT) operator. However, when the matrix is large, the SVT operator converges slowly and is computationally expensive. To address these problems, this paper proposes a Robust Principal Component Analysis algorithm based on Globally-convergent Iteratively Reweighted Least Squares (RPCA/GIRLS). In the first stage, the low-rank matrix of the original RPCA model is factorized into the product of two column-sparse matrix factors, and the two factors are estimated by alternating iteratively reweighted least squares (AIRLS), which reduces the computational complexity. Because AIRLS is sensitive to initialization, the factors obtained in the first stage are used as the input of a second stage, in which they are refined by gradient-descent steps until a low-rank matrix satisfying the global convergence conditions is obtained. Extensive experiments on six public video data sets, comparing the background-separation results and their quantitative evaluation metrics, verify the effectiveness and superiority of the proposed algorithm from both subjective and objective perspectives.
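
The two-stage procedure described above can be illustrated with a minimal NumPy sketch. Everything in the sketch is an assumption made for illustration, not the authors' formulation: the function names (svt, shrink, rpca_girls_sketch), the factor rank r, the column-weighting rule used in the AIRLS-style updates, the soft-thresholding of the sparse part, and the fixed gradient step size are placeholders chosen only to show the overall structure (factorize, alternate reweighted least-squares updates, then refine by gradient descent).

import numpy as np

def svt(X, tau):
    # Classical singular value thresholding: the per-iteration step of
    # nuclear-norm RPCA that, per the abstract, becomes slow for large matrices.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

def shrink(X, tau):
    # Elementwise soft-thresholding, used here for the sparse component.
    return np.sign(X) * np.maximum(np.abs(X) - tau, 0.0)

def rpca_girls_sketch(M, r=5, lam=None, n_iter=50, eps=1e-6, lr=1e-3):
    # Illustrative two-stage scheme: approximate M ~ U @ V.T + S,
    # with U, V low-rank factors and S sparse.
    m, n = M.shape
    lam = lam if lam is not None else 1.0 / np.sqrt(max(m, n))
    rng = np.random.default_rng(0)
    U = rng.standard_normal((m, r))
    V = rng.standard_normal((n, r))
    S = np.zeros((m, n))

    # Stage 1: alternating reweighted least-squares updates of the two factors.
    for _ in range(n_iter):
        R = M - S
        # Column weights from the current factor energies (one common IRLS choice).
        w = 1.0 / (np.linalg.norm(U, axis=0) * np.linalg.norm(V, axis=0) + eps)
        W = np.diag(w)
        V = np.linalg.solve(U.T @ U + lam * W, U.T @ R).T
        U = np.linalg.solve(V.T @ V + lam * W, V.T @ R.T).T
        S = shrink(M - U @ V.T, lam)

    # Stage 2: gradient-descent refinement of the factors obtained in stage 1.
    for _ in range(n_iter):
        E = U @ V.T + S - M
        U = U - lr * (E @ V)
        V = V - lr * (E.T @ U)
        S = shrink(M - U @ V.T, lam)

    return U @ V.T, S  # low-rank (background) part, sparse (foreground) part

As a hypothetical usage for the video background-modeling setting mentioned in the abstract, one would stack vectorized frames as the columns of M (pixels x frames) and call L_hat, S_hat = rpca_girls_sketch(M, r=2); L_hat then estimates the static background and S_hat the sparse moving foreground, with r, lam, and the step size tuned per data set.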


    Published In

    AIPR '22: Proceedings of the 2022 5th International Conference on Artificial Intelligence and Pattern Recognition
    September 2022
    1221 pages
    ISBN:9781450396899
    DOI:10.1145/3573942

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. Alternating iteratively reweighted least squares
    2. Background modeling
    3. Globally-convergent iteratively reweighted least squares
    4. Robust principal component analysis
    5. Singular value threshold operator

    Qualifiers

    • Research-article
    • Research
    • Refereed limited

    Conference

    AIPR 2022
