On finite Newton method for support vector regression

  • Original Article
  • Published:
Neural Computing and Applications

Abstract

In this paper, we propose a Newton iterative method for solving an ε-insensitive support vector regression problem formulated as an unconstrained optimization problem. The proposed method has the advantage that the solution is obtained by solving a system of linear equations a finite number of times, rather than by solving a quadratic optimization problem. Finite termination of the Newton method is proved for both the linear and the kernel support vector regression cases. Experiments were performed on IBM, Google, Citigroup and Sunspot time series, on which the proposed method converges in at most six iterations. Comparisons with the standard, least squares and smooth support vector regression methods, and with the exact solutions, clearly demonstrate the effectiveness of the proposed method.
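The approach described above can be illustrated with a minimal sketch. The code below is not the authors' implementation; it is a hypothetical Python sketch of a Newton iteration for linear SVR under a squared ε-insensitive loss, where each step solves one linear system built from a generalized Hessian over the examples currently outside the ε-tube. The function name `newton_svr` and the parameter defaults are illustrative assumptions.

```python
import numpy as np

def newton_svr(X, y, C=10.0, eps=0.1, tol=1e-6, max_iter=25):
    """Sketch of a finite-Newton-style solver for linear SVR with the
    unconstrained objective
        min_w  0.5*||w||^2 + 0.5*C * sum_i max(|x_i.w - y_i| - eps, 0)^2.
    Each iteration solves a single d x d linear system rather than a
    quadratic program."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(max_iter):
        r = X @ w - y                 # residuals
        excess = np.abs(r) - eps      # distance outside the eps-tube
        active = excess > 0           # examples contributing to the loss
        grad = w + C * X[active].T @ (np.sign(r[active]) * excess[active])
        if np.linalg.norm(grad) < tol:
            break                     # stationary point reached
        # Generalized Hessian: identity plus C times the Gram matrix of
        # the active examples (the loss is only piecewise twice
        # differentiable, hence "generalized").
        H = np.eye(d) + C * X[active].T @ X[active]
        w = w - np.linalg.solve(H, grad)
    return w
```

Because the objective is piecewise quadratic, once the active set stabilizes a single Newton step lands on the minimizer of that piece, which is the intuition behind the finite-termination result for methods of this kind.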



Acknowledgments

The authors would like to thank the referee for the valuable comments, which greatly improved an earlier version of the paper. The authors also thank Mr. Kapil for his assistance in running the SVR and LSSVR codes.

Author information

Corresponding author

Correspondence to S. Balasundaram.


About this article

Cite this article

Balasundaram, S., Singh, R. On finite Newton method for support vector regression. Neural Comput & Applic 19, 967–977 (2010). https://doi.org/10.1007/s00521-010-0361-0
