A new learning schema based on support vector for multi-classification

  • BIC-TA 2006
  • Published in: Neural Computing and Applications

Abstract

A novel learning schema based on support vectors, SVCMR, is proposed in this paper to address the M-class classification problem. It builds a tree-shaped decision frame in which M/2 nodes are constructed, each using the three-separation model as its basic classifier. A class selection rule is defined to ensure that the basic classifiers are trained, in turn, on the pair of classes with the maximum feature distance. Class contours are extracted as data representatives to reduce the size of the training set. In addition, the parameters involved in SVCMR are learned from data neighborhoods, which adapts the schema to various datasets and avoids the costly search of parameter spaces. Experiments on real datasets demonstrate that the performance of SVCMR is competitive with that of state-of-the-art classifiers while achieving higher efficiency.
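The abstract describes the schema only at a high level. The sketch below is a hypothetical Python illustration, not the paper's implementation: it assumes class centroids as the data representatives (the paper extracts class contours), a Euclidean interpretation of "maximum feature distance", and a caller-supplied `train_node` function standing in for the three-separation model.

```python
import numpy as np
from itertools import combinations

def class_centroids(X, y):
    # One mean vector per class: a simple stand-in for the paper's
    # class-contour representatives (assumption).
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def select_pair(centroids, available):
    # Class selection rule (assumed form): pick the pair of remaining
    # classes whose representatives are farthest apart in feature space.
    return max(combinations(sorted(available), 2),
               key=lambda p: np.linalg.norm(centroids[p[0]] - centroids[p[1]]))

def build_frame(available, centroids, train_node):
    # Tree-shaped decision frame: roughly M/2 nodes, each holding a basic
    # classifier trained on the currently farthest class pair; the
    # remaining classes are passed down to the child node.
    if len(available) < 2:
        return None
    a, b = select_pair(centroids, available)
    rest = [c for c in available if c not in (a, b)]
    return {
        "pair": (a, b),
        # `train_node` stands in for the three-separation model
        # (e.g. a ternary "a / b / everything else" classifier).
        "model": train_node(a, b, rest),
        "child": build_frame(rest, centroids, train_node),
    }
```

At prediction time a sample would descend this frame, with each node either committing to one of its pair of classes or deferring to its child node; that routing rule, like the other choices above, is an assumption made for illustration only.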




Acknowledgments

This work is supported by the National Natural Science Foundation of China under Grant No. 60433020; 985 Project: Technological Creation Support of Computation and Software Science; and the Key Laboratory for Symbol Computation and Knowledge Engineering of the National Education Ministry of China.

Author information

Corresponding author

Correspondence to Ling Ping.


About this article

Cite this article

Ping, L., Chun-Guang, Z. A new learning schema based on support vector for multi-classification. Neural Comput & Applic 17, 119–127 (2008). https://doi.org/10.1007/s00521-007-0097-7

