Abstract
A novel support-vector-based learning schema, SVCMR, is proposed in this paper to address the M-class classification problem. It builds a tree-shaped decision frame in which M/2 nodes are constructed with the three-separation model as the basic classifier. A class selection rule ensures that the basic classifiers are trained in turn on the pair of classes with the maximum feature distance. Class contours are extracted as data representatives to reduce the training set size. Moreover, the parameters involved in SVCMR are learned from data neighborhoods, which adapts the schema to various datasets and avoids the expensive cost of searching parameter spaces. Experiments on real datasets demonstrate that SVCMR is competitive with state-of-the-art classifiers while being more efficient.
Acknowledgments
This work was supported by the National Natural Science Foundation of China under Grant No. 60433020; the 985 Project: Technological Creation Support of Computation and Software Science; and the Key Laboratory for Symbol Computation and Knowledge Engineering of the National Education Ministry of China.
Cite this article
Ping, L., Chun-Guang, Z. A new learning schema based on support vector for multi-classification. Neural Comput & Applic 17, 119–127 (2008). https://doi.org/10.1007/s00521-007-0097-7