
Neighborhood Structure Preserving Ridge Regression for Dimensionality Reduction

Conference paper
Pattern Recognition (CCPR 2012)

Part of the book series: Communications in Computer and Information Science (CCIS, volume 321)

Abstract

Recent research shows that linear regression bears strong connections to many subspace learning methods, such as linear discriminant analysis and locality preserving projection. When linear regression methods are applied to dimensionality reduction, however, a major disadvantage is that they fail to consider the geometric structure of the data. In this paper, we propose a graph-regularized ridge regression for dimensionality reduction. We develop a new algorithm for affinity graph construction based on nonnegative least squares, and we use the affinity graph to capture the neighborhood geometric structure. The global and neighborhood structure information is then modeled jointly as a graph-regularized least squares problem. We design an efficient model selection scheme for optimal parameter estimation, which balances the tradeoff between the global and neighborhood structures. Extensive experiments on benchmark data sets demonstrate the effectiveness of our approach.
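To make the abstract's formulation concrete, the following is a minimal sketch of the kind of method it describes: an affinity graph built by reconstructing each sample from the others under a nonnegativity constraint, followed by a ridge regression penalized by the resulting graph Laplacian. This is an illustration under stated assumptions, not the authors' implementation; the function names, the use of scipy.optimize.nnls, and the regularization weights lam (global ridge term) and mu (neighborhood graph term) are all placeholders.

```python
import numpy as np
from scipy.optimize import nnls

def nnls_affinity_graph(X):
    """Affinity graph via nonnegative least squares: reconstruct each
    sample x_i from all other samples with nonnegative weights and use
    those weights as edge affinities (a hedged reading of the abstract's
    graph construction; restricting the reconstruction to k nearest
    neighbors is a common variant and an assumption here)."""
    n = X.shape[0]
    W = np.zeros((n, n))
    for i in range(n):
        others = [j for j in range(n) if j != i]
        # Solve min_w ||A w - x_i||_2 s.t. w >= 0, where the columns
        # of A are the other samples.
        w, _ = nnls(X[others].T, X[i])
        W[i, others] = w
    return 0.5 * (W + W.T)  # symmetrize the graph

def graph_regularized_ridge(X, Y, W, lam=1.0, mu=1.0):
    """Closed-form solution of a graph-regularized ridge problem,
        min_B ||X B - Y||_F^2 + lam ||B||_F^2 + mu tr(B^T X^T L X B),
    where L = D - W is the graph Laplacian. lam weighs the global
    (ridge) term and mu the neighborhood (graph) term, matching the
    tradeoff the abstract's model selection scheme tunes."""
    L = np.diag(W.sum(axis=1)) - W  # Laplacian of the affinity graph
    d = X.shape[1]
    G = X.T @ X + lam * np.eye(d) + mu * (X.T @ L @ X)
    return np.linalg.solve(G, X.T @ Y)
```

In the usual least-squares view of dimensionality reduction, Y would be a class indicator or spectral embedding matrix and X @ B the low-dimensional representation; choosing (lam, mu) by cross-validation over a grid is one plausible instantiation of the paper's model selection scheme, assumed here rather than taken from the text.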

Copyright information

© 2012 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Shu, X., Lu, H. (2012). Neighborhood Structure Preserving Ridge Regression for Dimensionality Reduction. In: Liu, CL., Zhang, C., Wang, L. (eds) Pattern Recognition. CCPR 2012. Communications in Computer and Information Science, vol 321. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-33506-8_4

  • DOI: https://doi.org/10.1007/978-3-642-33506-8_4

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-33505-1

  • Online ISBN: 978-3-642-33506-8

  • eBook Packages: Computer Science, Computer Science (R0)
