Abstract
When training Support Vector Machines (SVMs), precomputed elements of the Hessian matrix are usually cached to avoid recomputation. However, the widely used least-recently-used (LRU) replacement policy is poorly suited to this setting, because the elements are requested in an irregular pattern. This paper presents a new cache replacement algorithm for Sequential Minimal Optimization (SMO): when the cache is full, the entry corresponding to the component with the minimal violation of the Karush-Kuhn-Tucker (KKT) conditions is evicted to make room for a new one. Experiments show that this policy achieves a higher hit ratio than an LRU cache and reduces training time on tasks where computing the Hessian elements is expensive.
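The eviction rule described in the abstract can be sketched in a few lines of Python. This is a minimal illustration under stated assumptions, not the authors' implementation: the class name KKTViolationCache, the helpers kernel_row, kkt_violation, and rbf_row, and the placeholder violation score are all hypothetical; a real SMO solver would supply its own kernel computation and its own KKT-violation measure derived from the current gradient.

```python
import numpy as np

def rbf_row(X, i, gamma=0.5):
    # Hypothetical helper: row i of the RBF-kernel Hessian for data matrix X.
    d = X - X[i]
    return np.exp(-gamma * np.einsum("ij,ij->i", d, d))

class KKTViolationCache:
    """Cache for rows of the SVM Hessian (kernel) matrix.

    Sketch of the paper's idea: when the cache is full, evict the row
    whose training point currently has the smallest KKT violation, on
    the assumption that SMO is least likely to request it again soon.
    """

    def __init__(self, capacity, kernel_row, kkt_violation):
        self.capacity = capacity            # max number of cached rows
        self.kernel_row = kernel_row        # i -> ndarray: computes row i of the Hessian
        self.kkt_violation = kkt_violation  # i -> float: current KKT violation of point i
        self.rows = {}                      # index -> cached Hessian row

    def get(self, i):
        if i not in self.rows:
            if len(self.rows) >= self.capacity:
                # Evict the cached entry with the minimal KKT violation,
                # instead of the least-recently-used one.
                victim = min(self.rows, key=self.kkt_violation)
                del self.rows[victim]
            self.rows[i] = self.kernel_row(i)  # cache miss: compute and store
        return self.rows[i]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 5))
    # Placeholder violation score for illustration only; in SMO this
    # would be maintained by the optimizer as the alphas change.
    grad = rng.standard_normal(100)
    cache = KKTViolationCache(
        capacity=10,
        kernel_row=lambda i: rbf_row(X, i),
        kkt_violation=lambda i: abs(grad[i]),
    )
    row = cache.get(3)  # computed and cached
    row = cache.get(3)  # cache hit
```

In this sketch the violation scores are read at eviction time rather than stored with the entries, so entries whose violations have shrunk since insertion are naturally preferred as victims.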
Copyright information
© 2002 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Li, J., Zhang, B., Lin, F. (2002). A New Cache Replacement Algorithm in SMO. In: Lee, SW., Verri, A. (eds) Pattern Recognition with Support Vector Machines. SVM 2002. Lecture Notes in Computer Science, vol 2388. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-45665-1_27
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-44016-1
Online ISBN: 978-3-540-45665-0