
Dimensionality reduction with adaptive graph

  • Research Article
  • Published in: Frontiers of Computer Science

Abstract

Graph-based dimensionality reduction (DR) methods have been applied successfully to many practical problems, such as face recognition, where graphs play a crucial role in modeling the data distribution or structure. In practice, however, the ideal graph is difficult to discover; one usually constructs a graph empirically according to various motivations, priors, or assumptions, independently of the subsequent computation of the DR mapping. In contrast to previous work, in this paper we attempt to learn a graph closely linked with the DR process, and propose an algorithm called dimensionality reduction with adaptive graph (DRAG), whose idea is to learn, while seeking the projection matrix, a graph in the neighborhood of a pre-specified one. Moreover, the pre-specified graph is treated as a noisy observation of the ideal one, and the squared Frobenius divergence is used to measure their difference in the objective function. As a result, we obtain an elegant graph-update formula that naturally fuses information from the original and transformed data. In particular, the optimal graph is shown to be a weighted sum of the pre-defined graph in the original space and a new graph depending on the transformed space. Empirical results on several face datasets demonstrate the effectiveness of the proposed algorithm.
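The alternation described in the abstract — fix the graph to solve for a projection, then refresh the graph as a weighted sum of the pre-specified affinity matrix and one built from the transformed data — can be sketched as follows. This is a minimal illustration, not the authors' exact formulation: the LPP-style generalized eigenproblem, the Gaussian kernel used for the transformed-space graph, and the trade-off parameter `beta` are all assumptions made for the sketch.

```python
import numpy as np
from scipy.linalg import eigh

def drag_sketch(X, W0, dim, beta=0.5, n_iter=5):
    """Illustrative alternating scheme in the spirit of DRAG.

    X    : (n, d) data matrix, one sample per row.
    W0   : (n, n) pre-specified affinity graph (treated as a noisy
           observation of the ideal graph).
    dim  : target dimensionality.
    beta : hypothetical trade-off between W0 and the graph built
           from the transformed data.
    """
    W = W0.copy()
    for _ in range(n_iter):
        # Projection step (LPP-style): solve the generalized
        # eigenproblem X' L X p = lambda X' D X p on the current graph.
        D = np.diag(W.sum(axis=1))
        L = D - W                                   # graph Laplacian
        A = X.T @ L @ X
        B = X.T @ D @ X + 1e-6 * np.eye(X.shape[1])  # regularize for stability
        _, vecs = eigh(A, B)
        P = vecs[:, :dim]                           # projection matrix
        Y = X @ P                                   # transformed data

        # Graph-update step: fuse the pre-specified graph with a
        # Gaussian-kernel graph computed in the transformed space.
        sq = ((Y[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
        Wy = np.exp(-sq / (sq.mean() + 1e-12))
        W = beta * W0 + (1.0 - beta) * Wy
    return P, W
```

With `beta = 1` the graph stays fixed at `W0` and the loop reduces to a single LPP-like projection; smaller `beta` lets the transformed-space structure reshape the graph, mirroring the weighted-sum update the abstract describes.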




Author information


Corresponding author

Correspondence to Songcan Chen.

Additional information

Lishan Qiao received his BS in mathematics from Liaocheng University (LU) in 2001. He received his MSc in applied mathematics from Chengdu University of Technology in 2004, and then worked at LU as an assistant lecturer. In 2010, he received his PhD in computer applications from Nanjing University of Aeronautics & Astronautics (NUAA). Currently he is an associate professor in the Department of Mathematics Science, LU. His research interests focus on image processing, pattern recognition, and machine learning.

Limei Zhang received her BS and MS in mathematics from Liaocheng University in 2001 and 2007, respectively. In 2010, she received her PhD in computer applications from NUAA. Currently she is an assistant professor in the Department of Mathematics Science, LU. Her research interests focus on pattern recognition and machine learning.

Songcan Chen received his BSc in mathematics from Hangzhou University (now merged into Zhejiang University) in 1983. In December 1985, he completed his MSc in computer applications at Shanghai Jiaotong University and then worked at NUAA in January 1986 as an assistant lecturer. There he received his PhD in communication and information systems in 1997. Since 1998, as a full professor, he has been with the Department of Computer Science and Engineering at NUAA. His research interests include pattern recognition, machine learning, and neural computing. In these fields, he has authored or coauthored over 160 scientific journal and conference papers.


About this article

Cite this article

Qiao, L., Zhang, L. & Chen, S. Dimensionality reduction with adaptive graph. Front. Comput. Sci. 7, 745–753 (2013). https://doi.org/10.1007/s11704-013-2234-z
