
Learning Linear and Nonlinear PCA with Linear Programming

Neural Processing Letters

Abstract

An SVM-like framework provides a novel way to learn linear principal component analysis (PCA); in effect it performs a weighted PCA and leads to a semi-definite programming (SDP) problem. In this paper, we learn linear and nonlinear PCA by solving linear programming problems, which are easy to solve and admit a unique global solution. Moreover, two algorithms for learning linear and nonlinear PCA are constructed, and all principal components can be obtained. To verify the performance of the proposed method, a series of experiments on artificial and UCI benchmark datasets is carried out. Simulation results demonstrate that the proposed method can match or outperform standard PCA and kernel PCA (KPCA) in generalization ability while consuming much less memory and time.
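The abstract does not spell out the linear-programming formulation itself, but the two baselines it compares against, standard PCA and kernel PCA, can be sketched as follows. This is a minimal NumPy illustration of those baselines under common conventions (RBF kernel, eigen-decomposition), not the authors' algorithm:

```python
import numpy as np

def pca(X, k):
    """Standard linear PCA: top-k eigenvectors of the sample covariance."""
    Xc = X - X.mean(axis=0)                  # centre the data
    C = Xc.T @ Xc / len(X)                   # covariance matrix
    w, V = np.linalg.eigh(C)                 # eigenvalues in ascending order
    return V[:, ::-1][:, :k]                 # top-k principal directions

def kpca(X, k, gamma=1.0):
    """Kernel PCA with an RBF kernel, in the style of Scholkopf et al. (1998)."""
    sq = ((X[:, None] - X[None]) ** 2).sum(-1)
    K = np.exp(-gamma * sq)                  # RBF Gram matrix
    n = len(X)
    J = np.eye(n) - np.ones((n, n)) / n
    Kc = J @ K @ J                           # centre in feature space
    w, A = np.linalg.eigh(Kc)
    w, A = w[::-1][:k], A[:, ::-1][:, :k]    # top-k eigenpairs
    return A / np.sqrt(np.maximum(w, 1e-12)) # normalised dual coefficients
```

Both baselines require an eigen-decomposition of a d×d or n×n matrix, which is the memory and time cost the paper's linear-programming approach is reported to reduce.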



Author information


Corresponding author

Correspondence to Wenjian Wang.


Cite this article

Zhang, R., Wang, W. Learning Linear and Nonlinear PCA with Linear Programming. Neural Process Lett 33, 151–170 (2011). https://doi.org/10.1007/s11063-011-9170-4
