
Fast Support Vector Training by Newton’s Method

  • Conference paper
Artificial Neural Networks and Machine Learning – ICANN 2011 (ICANN 2011)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 6792)


Abstract

We discuss a fast training method for support vector machines that uses Newton's method combined with fixed-size chunking. To speed up training, we limit the number of upper- or lower-bounded variables in the working set to two so that the corrections of the variables do not violate the bounding conditions. If similar working sets occur alternately, we merge the two working sets into one, and if similar working sets occur consecutively, we use incremental Cholesky factorization in calculating the corrections. Computer experiments show that the proposed method is comparable to or faster than SMO (sequential minimal optimization) using second-order information.
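The incremental Cholesky factorization mentioned in the abstract can be illustrated with a generic rank-extension update: when one new sample is appended to an already factorized kernel matrix, the factor can be extended in O(n²) instead of refactorizing from scratch in O(n³). This is a minimal sketch of that standard update, not the authors' exact implementation; all names are illustrative.

```python
import numpy as np

def cholesky_append(L, k_col, k_diag):
    """Extend the Cholesky factor L of an n x n matrix K to the factor of
    the (n+1) x (n+1) matrix obtained by appending column k_col and
    diagonal entry k_diag. Assumes the extended matrix stays positive
    definite (true for a kernel matrix with a regularized diagonal)."""
    n = L.shape[0]
    # Forward-substitute for the new off-diagonal row: L @ l = k_col.
    l = np.linalg.solve(L, k_col) if n > 0 else np.empty(0)
    d = np.sqrt(k_diag - l @ l)  # new diagonal entry of the factor
    L_new = np.zeros((n + 1, n + 1))
    L_new[:n, :n] = L
    L_new[n, :n] = l
    L_new[n, n] = d
    return L_new

# Usage: factor a small kernel-like matrix one row/column at a time.
K = np.array([[4.0, 2.0, 0.5],
              [2.0, 3.0, 1.0],
              [0.5, 1.0, 2.0]])
L = np.empty((0, 0))
for i in range(3):
    L = cholesky_append(L, K[:i, i], K[i, i])
assert np.allclose(L @ L.T, K)
```

When consecutive working sets share most of their variables, only the few new rows need this update, which is where the paper's speedup over a full refactorization comes from.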



References

  1. Abe, S.: Support Vector Machines for Pattern Classification. Springer, Heidelberg (2010)

  2. Platt, J.C.: Fast training of support vector machines using sequential minimal optimization. In: Advances in Kernel Methods: Support Vector Learning, pp. 185–208. MIT Press, Cambridge (1999)

  3. Fan, R.-E., Chen, P.-H., Lin, C.-J.: Working set selection using second order information for training support vector machines. Journal of Machine Learning Research 6, 1889–1918 (2005)

  4. Barbero, Á., Dorronsoro, J.R.: Faster directions for second order SMO. In: Diamantaras, K., Duch, W., Iliadis, L.S. (eds.) ICANN 2010. LNCS, vol. 6353, pp. 30–39. Springer, Heidelberg (2010)

  5. Keerthi, S.S., Gilbert, E.G.: Convergence of a generalized SMO algorithm for SVM classifier design. Machine Learning 46(1-3), 351–360 (2002)

  6. Abe, S.: Is primal better than dual? In: Alippi, C., Polycarpou, M., Panayiotou, C., Ellinas, G. (eds.) ICANN 2009. LNCS, vol. 5768, pp. 854–863. Springer, Heidelberg (2009)

  7. Chang, C.-C., Lin, C.-J.: LIBSVM – A library for support vector machines, http://www.csie.ntu.edu.tw/~cjlin/libsvm/




Copyright information

© 2011 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Abe, S. (2011). Fast Support Vector Training by Newton’s Method. In: Honkela, T., Duch, W., Girolami, M., Kaski, S. (eds) Artificial Neural Networks and Machine Learning – ICANN 2011. ICANN 2011. Lecture Notes in Computer Science, vol 6792. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-21738-8_19


  • DOI: https://doi.org/10.1007/978-3-642-21738-8_19

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-21737-1

  • Online ISBN: 978-3-642-21738-8

  • eBook Packages: Computer Science (R0)
