
Dimension Reduction with Semi-supervised Pairwise Covariance-Preserving Projection

  • Conference paper

Part of the book series: Communications in Computer and Information Science ((CCIS,volume 93))

Abstract

Dimension reduction is critical in many areas of pattern classification and machine learning, and many algorithms have been proposed. The Pairwise Covariance-Preserving Projection Method (PCPM) is an effective dimension reduction technique that maximizes class discrimination while approximately preserving the pairwise class covariances. A shortcoming of PCPM is that it can be applied only when all labels are given, making it a purely supervised method. Semi-supervised learning has attracted much attention in recent years because it can exploit both labeled and unlabeled data. In this paper, we extend PCPM to the semi-supervised setting. The labeled data points are used to maximize the separability between different classes, while the unlabeled data points are used to estimate the intrinsic geometric structure of the data. Specifically, we aim to learn a discriminant function that is as smooth as possible on the data manifold. The resulting optimization problem can be solved efficiently by eigenvalue decomposition. Experimental results on several datasets demonstrate the effectiveness of our method.
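The recipe described in the abstract (a supervised separation term from labeled points, a graph-based smoothness term over all points, and a solution via eigenvalue decomposition) can be illustrated with a minimal sketch. The snippet below is not the authors' exact PCPM formulation: the between-class scatter term stands in for the pairwise covariance-preserving criterion, and the k-nearest-neighbor graph, the trade-off weight `mu`, and the function name `semi_supervised_projection` are illustrative assumptions.

```python
# Minimal sketch of a semi-supervised, covariance-preserving style projection.
# NOTE: this is an illustrative stand-in, not the paper's exact PCPM objective.
import numpy as np
from scipy.linalg import eigh
from sklearn.neighbors import kneighbors_graph

def semi_supervised_projection(X, y, n_components=2, n_neighbors=5, mu=1.0):
    """X: (n, d) data; y: (n,) labels, with -1 marking unlabeled points."""
    labeled = y != -1
    Xl, yl = X[labeled], y[labeled]
    classes = np.unique(yl)

    # Supervised term: between-class scatter built from labeled class means
    # (a stand-in for the pairwise covariance-preserving criterion).
    mean_all = Xl.mean(axis=0)
    Sb = np.zeros((X.shape[1], X.shape[1]))
    for c in classes:
        Xc = Xl[yl == c]
        diff = (Xc.mean(axis=0) - mean_all)[:, None]
        Sb += len(Xc) * diff @ diff.T

    # Unsupervised term: a k-NN graph Laplacian over all points encourages the
    # learned projection to vary smoothly along the estimated data manifold.
    W = kneighbors_graph(X, n_neighbors, mode='connectivity', include_self=False)
    W = 0.5 * (W + W.T).toarray()
    L = np.diag(W.sum(axis=1)) - W

    # Maximize class separation subject to manifold smoothness via a
    # generalized eigenproblem; the top eigenvectors form the projection.
    A = Sb
    B = X.T @ L @ X + mu * np.eye(X.shape[1])
    vals, vecs = eigh(A, B)                      # eigenvalues in ascending order
    return vecs[:, ::-1][:, :n_components]       # columns = projection directions
```

Projecting the data with `Z = X @ semi_supervised_projection(X, y)` then gives the low-dimensional embedding; the weight `mu` controls how strongly the manifold-smoothness term regularizes the supervised criterion.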





Copyright information

© 2010 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Liu, X., Wang, Z., Liu, J., Feng, Z. (2010). Dimension Reduction with Semi-supervised Pairwise Covariance-Preserving Projection. In: Huang, DS., McGinnity, M., Heutte, L., Zhang, XP. (eds) Advanced Intelligent Computing Theories and Applications. ICIC 2010. Communications in Computer and Information Science, vol 93. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-14831-6_73

  • DOI: https://doi.org/10.1007/978-3-642-14831-6_73

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-14830-9

  • Online ISBN: 978-3-642-14831-6

  • eBook Packages: Computer Science (R0)
