
An efficient Kernel-based matrixized least squares support vector machine

  • Original Article
Neural Computing and Applications

Abstract

Matrix-pattern-oriented linear classifier design has proven successful in improving classification performance. This paper proposes an efficient kernelized classifier for the Matrixized Least Squares Support Vector Machine (MatLSSVM). The classifier is realized by introducing a kernel-induced distance metric and a majority-voting technique into MatLSSVM, and is therefore named the Kernel-based Matrixized Least Squares Support Vector Machine (KMatLSSVM). First, the Euclidean distance used to optimize MatLSSVM is replaced by a kernel-induced distance. Then, the weight vectors are given different initializations, and the resulting sub-classifiers are combined by the majority-vote rule, which expands the solution space and mitigates the local-solution problem of the original MatLSSVM. Experiments verify that a single iteration per sub-classifier is enough for the presented KMatLSSVM to achieve superior performance. As a result, compared with the original linear MatLSSVM, the proposed method offers significant advantages in both classification accuracy and computational complexity.
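The abstract describes the method only at a high level. The sketch below (Python/NumPy) is a hypothetical illustration of the two ingredients it names: an RBF-kernel-induced distance standing in for the squared Euclidean error, and a majority vote over sub-classifiers that start from different random right weight vectors and make a single training pass each. The bilinear decision function f(A) = u^T A v + b, the choice of RBF kernel, the one-pass gradient update, and all hyperparameters are assumptions made for illustration; they are not the paper's exact derivation.

```python
# Hypothetical sketch only -- not the paper's exact KMatLSSVM formulation.
import numpy as np

def kernel_sq_dist(s, t, gamma=1.0):
    """RBF-kernel-induced squared distance: K(s,s) - 2*K(s,t) + K(t,t)."""
    return 2.0 - 2.0 * np.exp(-gamma * (s - t) ** 2)

def fit_sub_classifier(patterns, labels, gamma=1.0, reg=1e-2, lr=0.1, seed=0):
    """One sub-classifier: draw a random right weight vector v, then make a
    single gradient pass on (u, b) that reduces the kernel-induced squared
    distance between each label and the bilinear output u^T A v + b."""
    rng = np.random.default_rng(seed)
    d1, d2 = patterns[0].shape
    v = rng.standard_normal(d2)          # a different initialization per seed
    u, b = np.zeros(d1), 0.0
    for A, y in zip(patterns, labels):   # one pass over the training set
        f = u @ A @ v + b
        # derivative of kernel_sq_dist(y, f) with respect to f
        g = -4.0 * gamma * (y - f) * np.exp(-gamma * (y - f) ** 2)
        u -= lr * (g * (A @ v) + reg * u)  # ridge-style shrinkage on u
        b -= lr * g
    return u, b, v

def predict_majority(models, patterns):
    """Majority vote over the signed outputs of all sub-classifiers."""
    votes = np.stack([np.sign([u @ A @ v + b for A in patterns])
                      for (u, b, v) in models])
    return np.sign(votes.sum(axis=0))

# Toy usage: 20 synthetic 8x6 matrix patterns with +/-1 labels, 5 voters.
rng = np.random.default_rng(42)
X = [rng.standard_normal((8, 6)) + lab * 0.5
     for lab in (1, -1) for _ in range(10)]
y = np.array([1.0] * 10 + [-1.0] * 10)
models = [fit_sub_classifier(X, y, seed=s) for s in range(5)]
print(predict_majority(models, X))
```

A single pass per sub-classifier mirrors the abstract's claim that one iteration is enough; the gradient update above is only a stand-in for the paper's least-squares solution.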


Acknowledgments

The authors would like to thank the Natural Science Foundation of China (Grant No. 60903091) and the Specialized Research Fund for the Doctoral Program of Higher Education (Grant No. 20090074120003) for partial support. This work is also supported by the Open Projects Program of the National Laboratory of Pattern Recognition.

Author information


Corresponding author

Correspondence to Zhe Wang.


About this article

Cite this article

Wang, Z., He, X., Gao, D. et al. An efficient Kernel-based matrixized least squares support vector machine. Neural Comput & Applic 22, 143–150 (2013). https://doi.org/10.1007/s00521-011-0677-4

