Abstract
In general, similar input data have similar target values. A novel Fast Support Vector Regression (FSVR) trained on a reduced training set is proposed. First, the training data are partitioned into blocks using traditional clustering methods such as K-means and FCM. Second, a membership function on each block is defined from the target values of the training data, so that every training sample receives a membership degree in the interval [0, 1]; multiplying the penalty coefficient C by this degree varies the penalty per sample. Third, the FSVR is trained on the reduced training set, which consists of the samples whose membership degrees are greater than or equal to a suitably chosen parameter λ. Experimental results on standard machine learning data sets show that FSVR not only achieves better or comparable performance but also reduces the number of training samples and speeds up training.
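The reduction step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the clustering uses K-means, and the membership formula (one minus the normalized deviation of a sample's target from its block's mean target) is an assumed stand-in for the paper's target-based membership function. The memberships then scale the penalty C per sample via `sample_weight`.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVR

def fsvr_reduce(X, y, n_blocks=5, lam=0.5, random_state=0):
    """Reduce the training set in the spirit of FSVR.

    1. Partition the data into blocks (here: K-means clustering).
    2. Assign each sample a membership degree in [0, 1] from its
       target value; the formula below (1 - normalized deviation
       from the block's mean target) is an illustrative assumption.
    3. Keep only samples whose membership degree >= lam.
    """
    labels = KMeans(n_clusters=n_blocks, n_init=10,
                    random_state=random_state).fit_predict(X)
    m = np.ones(len(y))
    for b in range(n_blocks):
        idx = labels == b
        dev = np.abs(y[idx] - y[idx].mean())
        if dev.max() > 0:
            m[idx] = 1.0 - dev / dev.max()
    keep = m >= lam
    return X[keep], y[keep], m[keep]

# Toy data: a noisy linear target over 200 random samples.
rng = np.random.RandomState(0)
X = rng.rand(200, 3)
y = X.sum(axis=1) + 0.05 * rng.randn(200)

# Train SVR on the reduced set; sample_weight makes the effective
# per-sample penalty C * m_i, as the membership-weighting suggests.
Xr, yr, mr = fsvr_reduce(X, y, n_blocks=4, lam=0.3)
model = SVR(C=10.0).fit(Xr, yr, sample_weight=mr)
```

With λ close to 0 the full training set is kept; raising λ discards the samples whose targets deviate most within their block, trading a little accuracy for a smaller quadratic program and faster training.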
Copyright information
© 2011 Springer-Verlag Berlin Heidelberg
Cite this paper
Zhou, W., Xiong, Y., Wu, Ca., Liu, H. (2011). Fast Support Vector Regression Based on Cut. In: Tan, Y., Shi, Y., Chai, Y., Wang, G. (eds) Advances in Swarm Intelligence. ICSI 2011. Lecture Notes in Computer Science, vol 6729. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-21524-7_44
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-21523-0
Online ISBN: 978-3-642-21524-7
eBook Packages: Computer Science (R0)