Abstract
Matrix-pattern-oriented linear classifier design has proven successful in improving classification performance. This paper proposes an efficient kernelized classifier for the Matrixized Least Squares Support Vector Machine (MatLSSVM). The classifier is obtained by introducing a kernel-induced distance metric and a majority-voting technique into MatLSSVM, and is therefore named the Kernel-based Matrixized Least Squares Support Vector Machine (KMatLSSVM). First, the Euclidean distance used in optimizing MatLSSVM is replaced by a kernel-induced distance; then the weight vectors are given different initializations, and the resulting sub-classifiers are combined by majority vote, which expands the solution space and mitigates the local-solution problem of the original MatLSSVM. Experiments verify that a single iteration suffices for each sub-classifier of the presented KMatLSSVM to achieve superior performance. Consequently, compared with the original linear MatLSSVM, the proposed method offers significant advantages in both classification accuracy and computational complexity.
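The following is a minimal, illustrative sketch of the idea summarized above: several matrixized sub-classifiers with a bilinear decision function f(A) = uᵀAv + b are trained from different random initializations of the weight vectors, the squared-error (Euclidean) term of the least-squares objective is replaced by an RBF kernel-induced squared distance, and the sub-classifiers' predictions are combined by majority vote. The function names, the plain gradient-descent updates (used here in place of the paper's single alternating iteration), and all hyper-parameter values are assumptions for illustration, not the authors' implementation.

```python
# Illustrative sketch only (not the authors' code): bilinear sub-classifiers
# f(A) = u^T A v + b, an RBF kernel-induced squared distance in place of the
# Euclidean (squared-error) loss, and majority voting over sub-classifiers
# started from different random initializations.
import numpy as np


def kernel_induced_sqdist(f, y, sigma=1.0):
    """d_k^2(f, y) = k(f, f) - 2 k(f, y) + k(y, y) for an RBF kernel k."""
    return 2.0 - 2.0 * np.exp(-(f - y) ** 2 / (2.0 * sigma ** 2))


def train_subclassifier(As, y, reg=1e-2, sigma=1.0, lr=0.1, steps=200, seed=0):
    """One sub-classifier trained by plain gradient descent on the regularized
    kernel-induced-distance objective (a simplification of the paper's scheme)."""
    rng = np.random.default_rng(seed)
    d1, d2 = As[0].shape
    u = 0.1 * rng.normal(size=d1)        # random initialization of the
    v = 0.1 * rng.normal(size=d2)        # left/right weight vectors
    b = 0.0
    n = len(As)
    for _ in range(steps):
        f = np.array([u @ A @ v for A in As]) + b      # bilinear outputs
        # derivative of the kernel-induced squared distance w.r.t. f
        g = 2.0 * np.exp(-(f - y) ** 2 / (2.0 * sigma ** 2)) * (f - y) / sigma ** 2
        gu = sum(gi * (A @ v) for gi, A in zip(g, As)) / n + reg * u
        gv = sum(gi * (A.T @ u) for gi, A in zip(g, As)) / n + reg * v
        u, v, b = u - lr * gu, v - lr * gv, b - lr * g.mean()
    return u, v, b


def predict_majority(models, As):
    """Combine the sign outputs of all sub-classifiers by majority vote."""
    votes = np.array([[np.sign(u @ A @ v + b) for A in As] for u, v, b in models])
    return np.sign(votes.sum(axis=0))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy matrix patterns: the class is the sign of the mean entry.
    As = [rng.normal(loc=c, size=(4, 5)) for c in [-0.5] * 40 + [0.5] * 40]
    y = np.array([-1.0] * 40 + [1.0] * 40)
    models = [train_subclassifier(As, y, seed=s) for s in range(5)]
    print("training accuracy:", np.mean(predict_majority(models, As) == y))
```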


Acknowledgments
The authors would like to thank the Natural Science Foundation of China (Grant No. 60903091) and the Specialized Research Fund for the Doctoral Program of Higher Education (Grant No. 20090074120003) for partial support. This work is also supported by the Open Projects Program of the National Laboratory of Pattern Recognition.
Cite this article
Wang, Z., He, X., Gao, D. et al. An efficient Kernel-based matrixized least squares support vector machine. Neural Comput & Applic 22, 143–150 (2013). https://doi.org/10.1007/s00521-011-0677-4