Robust Sparse PCA via Weighted Elastic Net

  • Conference paper

Part of the book series: Communications in Computer and Information Science (CCIS, volume 321)

Abstract

In principal component analysis (PCA), the ℓ2/ℓ1-norm is widely used to measure the coding residual, which implicitly assumes that the residual follows a Gaussian/Laplacian distribution. In practice, however, this assumption may fail to describe the coding errors when outliers are present. To this end, this paper proposes a Robust Sparse PCA (RSPCA) approach that addresses the outlier problem by modeling sparse coding as a sparsity-constrained weighted regression problem. Through a series of equivalent transformations, we show that the proposed RSPCA is equivalent to a Weighted Elastic Net (WEN) problem, so the Least Angle Regression Elastic Net (LARS-EN) algorithm can be used to obtain the optimal solution. Simulation results illustrate the effectiveness of this approach.
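The core reduction the abstract describes, recasting a weighted residual term as an ordinary elastic-net problem, can be illustrated independently of the paper's full method. The sketch below is not the authors' RSPCA algorithm (which uses LARS-EN); it is a minimal numpy coordinate-descent solver for the elastic net, plus the standard row-rescaling trick that turns a weighted elastic net into an unweighted one by multiplying each row of the data by the square root of its weight. All function names and parameter choices here are illustrative assumptions.

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator, the proximal map of the l1 penalty."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def elastic_net_cd(X, y, lam1, lam2, n_iter=500):
    """Coordinate descent for (1/2)||y - Xb||^2 + lam1*||b||_1 + (lam2/2)*||b||^2."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)          # ||x_j||^2 for each column
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual correlation with column j (b_j held out)
            r_j = X[:, j] @ (y - X @ b) + col_sq[j] * b[j]
            b[j] = soft_threshold(r_j, lam1) / (col_sq[j] + lam2)
    return b

def weighted_elastic_net(X, y, w, lam1, lam2, n_iter=500):
    """Weighted elastic net via the equivalent-transformation trick:
    sum_i w_i*(y_i - x_i b)^2 equals an ordinary squared loss after
    scaling row i of (X, y) by sqrt(w_i)."""
    s = np.sqrt(w)
    return elastic_net_cd(s[:, None] * X, s * y, lam1, lam2, n_iter)

# Toy demonstration: downweighting outlying samples recovers a sparse
# coefficient vector that uniform weights distort.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
b_true = np.array([2.0, 0.0, -1.5, 0.0, 1.0])
y = X @ b_true + 0.1 * rng.normal(size=100)
y[:10] += 20.0                              # gross outliers
w = np.ones(100)
w[:10] = 0.01                               # robust weights suppress them
b_weighted = weighted_elastic_net(X, y, w, lam1=0.1, lam2=0.1)
```

In the paper the weights come from the robust residual model rather than being known in advance, and the resulting WEN problem is handed to LARS-EN instead of coordinate descent; the rescaling equivalence, however, is the same.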




Copyright information

© 2012 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Wang, L., Cheng, H. (2012). Robust Sparse PCA via Weighted Elastic Net. In: Liu, CL., Zhang, C., Wang, L. (eds) Pattern Recognition. CCPR 2012. Communications in Computer and Information Science, vol 321. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-33506-8_12


  • DOI: https://doi.org/10.1007/978-3-642-33506-8_12

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-33505-1

  • Online ISBN: 978-3-642-33506-8

  • eBook Packages: Computer Science (R0)
