
Multi-class Support Vector Machine Simplification

  • Conference paper
PRICAI 2008: Trends in Artificial Intelligence (PRICAI 2008)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 5351)

Abstract

In support vector learning, the computational complexity of the testing phase scales linearly with the number of support vectors (SVs) included in the solution, the support vector machine (SVM). Among the different approaches, reduced set methods speed up the testing phase by replacing the original SVM with a simplified one that consists of a smaller number of SVs, called reduced vectors (RVs). In this paper we introduce an extension of the bottom-up method for binary-class SVMs to multi-class SVMs. The extension includes: calculations for optimally combining two multi-weighted SVs, a selection heuristic for choosing a good pair of SVs to replace with a newly created vector, and an algorithm for reducing the number of SVs included in an SVM classifier. We show that our method possesses key advantages over others in terms of applicability, efficiency and stability. In constructing RVs, it requires finding only a single maximum point of a one-variable function. Experimental results on public datasets show that the simplified SVMs can run up to 100 times faster than the original SVMs with almost no change in predictive accuracy.
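The core operation the abstract refers to, merging a pair of weighted SVs into one reduced vector by maximizing a one-variable function, can be sketched for the single-weight, Gaussian-kernel case roughly as follows. This is a minimal illustrative sketch, not the paper's multi-class formulas: the restriction of the reduced vector to the segment between the two SVs, the assumption of like-signed weights, the helper names (rbf, combine_two_svs), and the SciPy-based bounded 1-D search are assumptions made here for illustration.

```python
import numpy as np
from scipy.optimize import minimize_scalar


def rbf(x, y, gamma):
    """Gaussian RBF kernel k(x, y) = exp(-gamma * ||x - y||^2)."""
    return np.exp(-gamma * np.sum((x - y) ** 2))


def combine_two_svs(xi, xj, ai, aj, gamma):
    """Merge two weighted SVs into one reduced vector (illustrative only).

    Approximates ai*Phi(xi) + aj*Phi(xj) by beta*Phi(z), where z is
    constrained to the segment z = u*xi + (1-u)*xj. With a Gaussian
    kernel and like-signed weights, finding z reduces to maximising a
    one-variable function of u, as the abstract describes.
    """
    d2 = np.sum((xi - xj) ** 2)  # squared distance between the pair

    # Projection of the combined feature-space image onto Phi(z),
    # written as a function of the single mixing variable u.
    def neg_projection(u):
        return -(ai * np.exp(-gamma * (1.0 - u) ** 2 * d2)
                 + aj * np.exp(-gamma * u ** 2 * d2))

    res = minimize_scalar(neg_projection, bounds=(0.0, 1.0), method="bounded")
    u = res.x
    z = u * xi + (1.0 - u) * xj

    # New coefficient: since k(z, z) = 1 for the RBF kernel, the best
    # beta equals the projection value itself.
    beta = ai * rbf(xi, z, gamma) + aj * rbf(xj, z, gamma)
    return z, beta


# Example: merge two nearby SVs with positive weights.
xi = np.array([0.0, 1.0])
xj = np.array([0.5, 0.8])
z, beta = combine_two_svs(xi, xj, ai=0.7, aj=0.3, gamma=1.0)
print(z, beta)
```

In the multi-class setting each SV carries several weights (one per constituent binary classifier), so the merging objective aggregates all of them; as the abstract states, constructing each RV still only requires locating a single maximum of a one-variable function. A reduction algorithm of this kind then repeatedly selects a pair of SVs via a heuristic and replaces it with the merged vector until the desired size is reached.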





Copyright information

© 2008 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Nguyen, D., Matsumoto, K., Hashimoto, K., Takishima, Y., Takatori, D., Terabe, M. (2008). Multi-class Support Vector Machine Simplification. In: Ho, TB., Zhou, ZH. (eds) PRICAI 2008: Trends in Artificial Intelligence. PRICAI 2008. Lecture Notes in Computer Science (LNAI), vol 5351. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-89197-0_74

  • DOI: https://doi.org/10.1007/978-3-540-89197-0_74

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-89196-3

  • Online ISBN: 978-3-540-89197-0

  • eBook Packages: Computer Science, Computer Science (R0)
