
Maximum inter-class and marginal discriminant embedding (MIMDE) for feature extraction and classification

  • Original Article
  • Published in Neural Computing and Applications

Abstract

This paper develops a supervised discriminant technique based on graph embedding, called maximum inter-class and marginal discriminant embedding (MIMDE), for feature extraction and classification in high-dimensional data spaces. In the graph-embedding framework, the objective is to find a linear transform matrix that makes samples of the same class as compact as possible and samples of different classes as dispersed as possible, using both local properties (an intra-class compactness graph and an inter-class separability graph) and a marginal property (a margin separability graph). This characteristic makes MIMDE more intuitive and more powerful than marginal Fisher analysis, which does not consider an inter-class separability graph. Experimental results on the ORL and AR face databases show the effectiveness of the proposed method.
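To make the graph-embedding idea above concrete, the following is a minimal sketch, in Python with NumPy/SciPy, of how a projection of this kind can be computed. It is an assumption-laden illustration rather than the paper's exact MIMDE objective: the k-nearest-neighbor graph construction, binary edge weights, the single combined separability graph, and the regularization term are choices made here purely for brevity.

```python
# A minimal, illustrative sketch (NOT the exact MIMDE formulation) of a
# graph-embedding discriminant projection: same-class neighbors are pulled
# together and nearby different-class (marginal) neighbors are pushed apart
# via a generalized eigenproblem on graph Laplacians. The binary 0/1 edge
# weights, neighborhood sizes, and regularizer are assumptions for illustration.
import numpy as np
from scipy.linalg import eigh


def graph_embedding_projection(X, y, k_intra=5, k_inter=5, dim=10, reg=1e-6):
    """X: (n_samples, n_features) data matrix; y: (n_samples,) class labels.
    Returns a linear transform matrix V of shape (n_features, dim)."""
    n = X.shape[0]
    d2 = np.square(X[:, None, :] - X[None, :, :]).sum(-1)  # pairwise squared distances
    np.fill_diagonal(d2, np.inf)

    Wc = np.zeros((n, n))  # intra-class compactness graph
    Wb = np.zeros((n, n))  # inter-class / marginal separability graph
    for i in range(n):
        same = np.flatnonzero(y == y[i])
        same = same[same != i]
        diff = np.flatnonzero(y != y[i])
        # connect each sample to its nearest neighbors of the same class ...
        for j in same[np.argsort(d2[i, same])][:k_intra]:
            Wc[i, j] = Wc[j, i] = 1.0
        # ... and to its nearest neighbors from other classes (marginal pairs)
        for j in diff[np.argsort(d2[i, diff])][:k_inter]:
            Wb[i, j] = Wb[j, i] = 1.0

    Lc = np.diag(Wc.sum(axis=1)) - Wc  # Laplacian of the compactness graph
    Lb = np.diag(Wb.sum(axis=1)) - Wb  # Laplacian of the separability graph

    # Generalized eigenproblem: maximize the separability scatter X^T Lb X
    # relative to the compactness scatter X^T Lc X; keep the top `dim` vectors.
    Sb = X.T @ Lb @ X
    Sc = X.T @ Lc @ X + reg * np.eye(X.shape[1])
    _, vecs = eigh(Sb, Sc)          # eigenvalues returned in ascending order
    return vecs[:, ::-1][:, :dim]
```

In this line of work it is common to first reduce the raw face images with PCA so that the compactness scatter matrix is well conditioned, then project with the learned transform and classify with a nearest-neighbor rule; the sketch above assumes the data already fit in memory and omits that preprocessing step.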



Acknowledgments

This work is partially supported by the National Science Foundation of China under Grant Nos. 60632050, 90820306, 60873151, 60973098, 61005008, and 61005005.

Author information

Corresponding author

Correspondence to Minghua Wan.


About this article

Cite this article

Wan, M. Maximum inter-class and marginal discriminant embedding (MIMDE) for feature extraction and classification. Neural Comput & Applic 21, 1737–1743 (2012). https://doi.org/10.1007/s00521-011-0763-7
