Abstract
This paper proposes two methods that use the k-means clustering algorithm to decrease the number of support vectors (SVs) in support vector machine (SVM) training. The first method uses k-means clustering to construct a dataset of much smaller size than the original one, which then serves as the actual input for SVM training. The second method uses k-means clustering to reduce the number of SVs that span the decision function of the trained SVM classifier. Experimental results show that the proposed methods perform better than the standard Sequential Minimal Optimization (SMO) algorithm.
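As a rough illustration of the first method's idea, the sketch below compresses the training set by replacing each class's points with k-means cluster centres before fitting an SVM. The use of scikit-learn, the per-class clustering, and all parameter values are assumptions for illustration; the paper's exact procedure may differ.

```python
# Sketch: k-means compression of the training set before SVM training.
# Assumptions (not from the paper): scikit-learn, per-class clustering,
# 50 centres per class, RBF kernel.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_classification
from sklearn.svm import SVC

X, y = make_classification(n_samples=2000, n_features=10, random_state=0)

# Cluster each class separately so every centre keeps a well-defined label.
centres, labels = [], []
for cls in np.unique(y):
    km = KMeans(n_clusters=50, n_init=10, random_state=0).fit(X[y == cls])
    centres.append(km.cluster_centers_)
    labels.append(np.full(50, cls))

X_small = np.vstack(centres)        # 100 representative points instead of 2000
y_small = np.concatenate(labels)

# Train the SVM on the compressed set; far fewer candidate SVs are available.
clf = SVC(kernel="rbf").fit(X_small, y_small)
print("SVs kept:", clf.n_support_.sum(), "of", len(X_small), "inputs")
```

The second method would instead train on the full dataset first and then cluster the resulting support vectors to obtain a smaller expansion of the decision function; the sketch above covers only the first method.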
Copyright information
© 2005 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Xia, X.L., Lyu, M.R., Lok, T.M., Huang, G.B. (2005). Methods of Decreasing the Number of Support Vectors via k-Mean Clustering. In: Huang, D.S., Zhang, X.P., Huang, G.B. (eds.) Advances in Intelligent Computing. ICIC 2005. Lecture Notes in Computer Science, vol. 3644. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11538059_75
DOI: https://doi.org/10.1007/11538059_75
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-28226-6
Online ISBN: 978-3-540-31902-3