
A New Incremental Optimal Feature Extraction Method for On-Line Applications

  • Conference paper
Image Analysis and Recognition (ICIAR 2007)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 4633)


Abstract

In this paper, we introduce new adaptive learning algorithms to extract linear discriminant analysis (LDA) features from multidimensional data in order to reduce the dimensionality of the data space. For this purpose, new adaptive algorithms for the computation of the square root of the inverse covariance matrix, Σ^{-1/2}, are introduced. The convergence of the new adaptive algorithm is proved by presenting the related cost function and discussing its initial conditions. The new adaptive algorithms are applied before an adaptive principal component analysis algorithm in order to construct an adaptive multivariate multi-class LDA algorithm. The adaptive nature of the new optimal feature extraction method makes it appropriate for on-line pattern recognition applications. Both adaptive algorithms in the proposed structure are trained simultaneously, using a stream of input data. Experimental results on synthetic and real multi-class, multidimensional data sequences demonstrate the effectiveness of the new adaptive feature extraction algorithm.
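To make the described structure concrete, the following is a minimal sketch of an on-line LDA feature extractor of the kind the abstract outlines: one adaptive stage drives a matrix W toward Sigma_w^{-1/2} from streamed samples, and a second adaptive stage (Sanger's generalized Hebbian algorithm for PCA, as in reference work on adaptive PCA) extracts the leading directions of the whitened between-class structure, with both stages updated simultaneously from each incoming sample. The class names, learning rates, and the specific update rules are illustrative assumptions, not the paper's exact algorithm.

# Sketch only: assumed stochastic recursion W <- W + eta * (I - W x x^T W)
# for the inverse square-root of the within-class covariance, followed by
# Sanger's GHA on the whitened between-class vectors.
import numpy as np

class OnlineLDA:
    def __init__(self, dim, n_components, n_classes, eta_w=1e-3, eta_p=1e-3):
        self.W = np.eye(dim)                                 # estimate of Sigma_w^{-1/2}
        self.P = 0.01 * np.random.randn(n_components, dim)   # GHA weight matrix
        self.means = np.zeros((n_classes, dim))              # running per-class means
        self.counts = np.zeros(n_classes)
        self.global_mean = np.zeros(dim)
        self.n_seen = 0
        self.eta_w, self.eta_p = eta_w, eta_p

    def partial_fit(self, x, label):
        """Update all running estimates from a single labelled sample."""
        # running class-conditional and global means
        self.counts[label] += 1
        self.means[label] += (x - self.means[label]) / self.counts[label]
        self.n_seen += 1
        self.global_mean += (x - self.global_mean) / self.n_seen

        # stage 1: adapt W toward the inverse square root of the
        # within-class covariance using the class-centred sample
        xc = x - self.means[label]
        Wx = self.W @ xc
        self.W += self.eta_w * (np.eye(len(x)) - np.outer(Wx, Wx))

        # stage 2: GHA (Sanger's rule) on the whitened between-class vector
        # y = P z,   P <- P + eta * (y z^T - lower_triangular(y y^T) P)
        z = self.W @ (self.means[label] - self.global_mean)
        y = self.P @ z
        self.P += self.eta_p * (np.outer(y, z) - np.tril(np.outer(y, y)) @ self.P)

    def transform(self, x):
        """Project a sample onto the current discriminant directions."""
        return self.P @ (self.W @ (x - self.global_mean))

In use, a stream of labelled samples would be fed one at a time to partial_fit, and transform applied to incoming test vectors; in practice, decaying learning rates and a short burn-in period would typically be needed for stable convergence of both stages.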




Editor information

Mohamed Kamel, Aurélio Campilho


Copyright information

© 2007 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Ghassabeh, Y.A., Moghaddam, H.A. (2007). A New Incremental Optimal Feature Extraction Method for On-Line Applications. In: Kamel, M., Campilho, A. (eds) Image Analysis and Recognition. ICIAR 2007. Lecture Notes in Computer Science, vol 4633. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-74260-9_36


  • DOI: https://doi.org/10.1007/978-3-540-74260-9_36

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-74258-6

  • Online ISBN: 978-3-540-74260-9

  • eBook Packages: Computer Science, Computer Science (R0)
