Abstract
In this paper, we propose a Newton iterative method for solving ε-insensitive support vector regression formulated as an unconstrained optimization problem. The proposed method has the advantage that the solution is obtained by solving a system of linear equations a finite number of times rather than by solving a quadratic optimization problem. Finite termination of the Newton method is proved for both linear and kernel support vector regression. Experiments were performed on the IBM, Google, Citigroup and Sunspot time series, on which the proposed method converges in at most six iterations. Comparison of the results with those of the standard, least squares and smooth support vector regression methods, and with the exact solutions, clearly demonstrates the effectiveness of the proposed method.
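To illustrate the flavor of such an iteration, the following is a minimal Python/NumPy sketch of a generalized Newton step for an unconstrained SVR with a squared ε-insensitive loss. It is a sketch under assumptions, not the paper's implementation: the exact objective, bias handling, step-size rule and kernelized variant used in the paper may differ, and the function name and the parameters C and eps below are illustrative only.

import numpy as np

def newton_epsilon_svr(A, y, C=10.0, eps=0.1, tol=1e-6, max_iter=50):
    # Generalized Newton iteration for an unconstrained epsilon-insensitive
    # SVR of the (assumed) form
    #     min_u  0.5*||u||^2 + 0.5*C*|| max(|A u - y| - eps, 0) ||^2,
    # where the last column of A is all ones so the bias is absorbed into u.
    # Each iteration solves one linear system instead of a quadratic program.
    m, n = A.shape
    u = np.zeros(n)
    for k in range(max_iter):
        r = A @ u - y                                      # residuals
        s = np.sign(r) * np.maximum(np.abs(r) - eps, 0.0)  # eps-insensitive residuals
        grad = u + C * (A.T @ s)                           # gradient of the objective
        if np.linalg.norm(grad) <= tol:
            return u, k                                    # converged
        d = (np.abs(r) > eps).astype(float)                # indicator of active residuals
        H = np.eye(n) + C * (A.T * d) @ A                  # generalized Hessian I + C*A'*D*A
        u = u - np.linalg.solve(H, grad)                   # full Newton step
    return u, max_iter

# Illustration on a toy linear problem (not the time series used in the paper).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = X @ np.array([1.5, -2.0, 0.5]) + 0.3 + 0.05 * rng.normal(size=200)
A = np.hstack([X, np.ones((200, 1))])                      # append bias column
u, iters = newton_epsilon_svr(A, y)
print("iterations:", iters, "weights:", u.round(3))

Each step requires only the solution of one small linear system, which is the sense in which a finite number of linear solves replaces a quadratic programming problem in the approach described above.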




Acknowledgments
The authors would like to thank the referee for the valuable comments, which greatly improved an earlier version of the paper. The authors would also like to thank Mr. Kapil for his assistance in running the SVR and LSSVR codes.
Cite this article
Balasundaram, S., Singh, R. On finite Newton method for support vector regression. Neural Comput & Applic 19, 967–977 (2010). https://doi.org/10.1007/s00521-010-0361-0