A New Cache Replacement Algorithm in SMO

  • Conference paper
Pattern Recognition with Support Vector Machines (SVM 2002)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 2388)

Abstract

In methods for training Support Vector Machines (SVMs), computed elements of the Hessian matrix are usually cached to avoid recomputation. However, the widely used least-recently-used (LRU) replacement algorithm is not well suited to this setting, because the elements are requested in an irregular pattern. This paper presents a new cache replacement algorithm for Sequential Minimal Optimization (SMO): when the cache is full, the item corresponding to the component with the smallest violation of the Karush-Kuhn-Tucker (KKT) conditions is deleted to make room for a new one. Experiments show that the cache hit ratio improves over an LRU cache, and that training time is reduced on tasks where computing the elements of the Hessian matrix is very time-consuming.
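
As a rough illustration of the replacement policy described in the abstract, the following Python sketch caches rows of the Hessian (kernel) matrix and, when full, evicts the row whose training example currently has the smallest KKT violation rather than the least-recently-used row. This is a minimal sketch under assumed interfaces, not the authors' implementation: the names `KKTViolationCache`, `compute_row`, and `violation` are hypothetical, and the paper's exact violation measure is defined in the full text.

```python
# Hypothetical sketch: a kernel-row cache whose eviction policy discards the
# cached row belonging to the example with the smallest current KKT violation,
# instead of the least-recently-used entry.  Not the authors' code.

class KKTViolationCache:
    def __init__(self, capacity, compute_row, violation):
        self.capacity = capacity        # maximum number of cached Hessian rows
        self.compute_row = compute_row  # callable: index i -> row of the Hessian (expensive kernel evaluations)
        self.violation = violation      # callable: index i -> current KKT violation of example i
        self.rows = {}                  # index -> cached row

    def get(self, i):
        """Return row i of the Hessian, computing and caching it on a miss."""
        if i in self.rows:              # cache hit
            return self.rows[i]
        if len(self.rows) >= self.capacity:
            # Evict the cached row whose example violates the KKT conditions
            # the least: it is the least likely to re-enter the working set soon.
            victim = min(self.rows, key=self.violation)
            del self.rows[victim]
        row = self.compute_row(i)       # cache miss: compute the kernel row
        self.rows[i] = row
        return row
```

In an SMO loop, `violation(i)` would be re-evaluated as the Lagrange multipliers change, so a row that was heavily used early in training can still be evicted once its example stops violating the KKT conditions; this is the behavioural difference from an LRU cache that the abstract's experiments measure.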

Copyright information

© 2002 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Li, J., Zhang, B., Lin, F. (2002). A New Cache Replacement Algorithm in SMO. In: Lee, SW., Verri, A. (eds) Pattern Recognition with Support Vector Machines. SVM 2002. Lecture Notes in Computer Science, vol 2388. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-45665-1_27

  • DOI: https://doi.org/10.1007/3-540-45665-1_27

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-44016-1

  • Online ISBN: 978-3-540-45665-0
