Robust support vector quantile regression with truncated pinball loss (RSVQR)

Published in: Computational and Applied Mathematics

Abstract

Support vector quantile regression (SVQR) adapts the flexible pinball loss function to measure empirical risk in regression problems. Furthermore, \(\varepsilon\)-SVQR obtains sparsity by introducing the \(\varepsilon\)-insensitive approach into SVQR. Despite their excellent generalisation performance, the loss functions employed by SVQR and \(\varepsilon\)-SVQR remain sensitive to noise and outliers. This paper proposes a new robust SVQR model, robust support vector quantile regression with truncated pinball loss (RSVQR). RSVQR employs a truncated pinball loss function to reduce the impact of noise. This loss function is non-convex, which might lead to a local optimum. To solve the resulting non-convex optimization problem, we apply the concave–convex procedure (CCCP) to the cost function of the proposed method, which decomposes the total loss into one convex and one concave part. Several artificial and real-world datasets are considered for the experimental analysis. Support vector regression (SVR), Huber loss-based SVR (HSVR), asymmetric HSVR (AHSVR), SVQR and \(\varepsilon\)-SVQR are used to compare the results of the proposed model. The obtained results reveal the applicability of the proposed RSVQR model.
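The truncated pinball loss and its difference-of-convex split can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names, the single cap parameter `s`, and the uniform truncation on both sides are assumptions for the sketch.

```python
import numpy as np

def pinball(u, tau):
    """Pinball (quantile) loss: tau*u for u >= 0, (tau - 1)*u for u < 0."""
    return np.where(u >= 0, tau * u, (tau - 1.0) * u)

def truncated_pinball(u, tau, s):
    """Pinball loss capped at s, so any single large residual (outlier)
    contributes at most s to the empirical risk."""
    return np.minimum(pinball(u, tau), s)

def dc_parts(u, tau, s):
    """Difference-of-convex split used by CCCP:
    min(p, s) = p - max(p - s, 0), with p the (convex) pinball loss."""
    p = pinball(u, tau)
    convex = p                       # kept as-is in each CCCP subproblem
    concave = -np.maximum(p - s, 0)  # linearised at the current iterate
    return convex, concave

residuals = np.array([-10.0, -0.5, 0.0, 0.5, 10.0])
tau, s = 0.5, 1.0
t = truncated_pinball(residuals, tau, s)
cvx, ccv = dc_parts(residuals, tau, s)
assert np.allclose(t, cvx + ccv)  # the split reproduces the truncated loss
```

CCCP then solves a sequence of convex subproblems in which the concave term is replaced by its tangent at the current solution; each iterate does not increase the objective (Yuille and Rangarajan 2002).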


[Figures 1–11 appear in the full article.]

Data availability

The datasets are collected from the UCI machine learning repository (Bache and Lichman 2013) and the KEEL data repository (Alcalá-Fdez et al. 2011).

References

  • Alcalá-Fdez J, Fernández A, Luengo J, Derrac J, García S, Sánchez L, Herrera F (2011) KEEL data-mining software tool: data set repository, integration of algorithms and experimental analysis framework. J Mult-Valued Log Soft Comput 17

  • Anand P, Rastogi R, Chandra S (2020) A new asymmetric ϵ-insensitive pinball loss function based support vector quantile regression model. Appl Soft Comput:106473

  • Awad M, Khanna R (2015) Support vector regression. In: Efficient learning machines. Apress, Berkeley, pp. 67–80

  • Bache K, Lichman M (2013) UCI machine learning repository. University of California, Irvine, School of Information and Computer Sciences. http://archive.ics.uci.edu/ml

  • Balasundaram S, Gupta D (2014) Lagrangian support vector regression via unconstrained convex minimization. Neural Netw 51:67–79

  • Balasundaram S, Meena Y (2019) Robust support vector regression in primal with asymmetric Huber loss. Neural Process Lett 49(3):1399–1431

  • Borah P, Gupta D (2020) Functional iterative approaches for solving support vector classification problems based on generalized Huber loss. Neural Comput Appl 32(13):9245–9265

  • Burges CJ (1998) A tutorial on support vector machines for pattern recognition. Data Min Knowl Disc 2(2):121–167

  • Collobert R, Sinz F, Weston J, Bottou L (2006) Trading convexity for scalability. In: Proceedings of the 23rd international conference on machine learning, pp 201–208

  • Cortes C, Vapnik V (1995) Support-vector networks. Mach Learn 20(3):273–297

  • Cristianini N, Shawe-Taylor J (2000) An introduction to support vector machines and other kernel-based learning methods. Cambridge University Press, Cambridge

  • Demšar J (2006) Statistical comparisons of classifiers over multiple data sets. J Mach Learn Res 7(Jan):1–30

  • Drucker H, Burges CJ, Kaufman L, Smola AJ, Vapnik V (1997) Support vector regression machines. In: Advances in neural information processing systems, pp 155–161

  • Gupta D, Hazarika BB, Berlin M (2020) Robust regularized extreme learning machine with asymmetric Huber loss function. Neural Comput Appl 32(16):12971–12998

  • Gupta D, Hazarika BB, Berlin M, Sharma UM, Mishra K (2021) Artificial intelligence for suspended sediment load prediction: a review. Environ Earth Sci 80(9):1–39

  • Hazarika BB, Gupta D (2021) Density-weighted support vector machines for binary class imbalance learning. Neural Comput Appl 33(9):4243–4261

  • Hsia JY, Lin CJ (2020) Parameter selection for linear support vector regression. IEEE Trans Neural Netw Learn Syst 31(12):5639–5644

  • Huang X, Shi L, Suykens JA (2014) Ramp loss linear programming support vector machine. J Mach Learn Res 15(1):2185–2211

  • Hwang CH (2010) M-quantile regression using kernel machine technique. J Korean Data Inform Sci Soc 21(5):973–981

  • Hwang H (2014) Support vector quantile regression for autoregressive data. J Korean Data Inform Sci Soc 25:1539–1547

  • Keerthi SS, Lin CJ (2003) Asymptotic behaviors of support vector machines with Gaussian kernel. Neural Comput 15(7):1667–1689

  • Kim S, Kim H (2016) A new metric of absolute percentage error for intermittent demand forecasts. Int J Forecast 32(3):669–679

  • Koenker R (2005) Quantile regression. Cambridge University Press, Cambridge

  • Koenker R, Bassett G Jr (1978) Regression quantiles. Econometrica 46(1):33–50

  • Koenker R, Hallock KF (2001) Quantile regression. J Econ Perspect 15(4):143–156

  • Lipp T, Boyd S (2016) Variations and extension of the convex–concave procedure. Optim Eng 17:263–287

  • Mangasarian OL, Musicant DR (2000) Robust linear and support vector regression. IEEE Trans Pattern Anal Mach Intell 22(9):950–955

  • Mehr AD, Nourani V, Khosrowshahi VK, Ghorbani MA (2019) A hybrid support vector regression–firefly model for monthly rainfall forecasting. Int J Environ Sci Technol 16(1):335–346

  • Mehrkanoon S, Huang X, Suykens JA (2014) Non-parallel support vector classifiers with different loss functions. Neurocomputing 143:294–301

  • Niu J, Chen J, Xu Y (2017) Twin support vector regression with Huber loss. J Intell Fuzzy Syst 32(6):4247–4258

  • Peng X, Xu D (2016) Projection support vector regression algorithms for data regression. Knowl Based Syst 112:54–66

  • Rastogi R, Pal A, Chandra S (2018) Generalized Pinball Loss SVMs. Neurocomputing 322:151–165

  • Seok KH, Cho D, Hwang C, Shim J (2010) Support vector quantile regression using asymmetric e-insensitive loss function. In: 2010 2nd International conference on education technology and computer, vol 1. IEEE, pp V1-438

  • Shen X, Tseng GC, Zhang X, Wong WH (2003) On ψ-learning. J Am Stat Assoc 98(463):724–734

  • Shen X, Niu L, Qi Z, Tian Y (2017) Support vector machine classifier with truncated pinball loss. Pattern Recogn 68:199–210

  • Smola AJ, Schölkopf B (2004) A tutorial on support vector regression. Stat Comput 14(3):199–222

  • Sriperumbudur BK, Lanckriet GR (2012) A proof of convergence of the concave-convex procedure using Zangwill’s theory. Neural Comput 24(6):1391–1407

  • Steinwart I, Scovel C (2005) Fast rates to Bayes for kernel machines. In: Advances in neural information processing systems, pp 1345–1352

  • Steinwart I, Christmann A (2011) Estimating conditional quantiles with the help of the pinball loss. Bernoulli 17(1):211–225

  • Takeuchi I, Le QV, Sears TD, Smola AJ (2006) Nonparametric quantile estimation. J Mach Learn Res 7(Jul):1231–1264

  • Tanveer M, Sharma A, Suganthan PN (2019) General twin support vector machine with pinball loss function. Inf Sci 494:311–327

  • Wu Q (2010) A hybrid-forecasting model based on Gaussian support vector machine and chaotic particle swarm optimization. Expert Syst Appl 37(3):2388–2394

  • Wu Y, Liu Y (2007) Robust truncated hinge loss support vector machines. J Am Stat Assoc 102(479):974–983

  • Wu Q, Yan HS (2009) Product sales forecasting model based on robust ν-support vector machine. Comput Integr Manuf Syst 15(06):1081–1087

  • Xu S, An X, Qiao X, Zhu L, Li L (2013) Multi-output least-squares support vector regression machines. Pattern Recogn Lett 34(9):1078–1084

  • Xu Q, Zhang J, Jiang C, Huang X, He Y (2015) Weighted quantile regression via support vector machine. Expert Syst Appl 42(13):5441–5451

  • Yu K, Lu Z, Stander J (2003) Quantile regression: applications and current research areas. J R Stat Soc Ser D (The Statist) 52(3):331–350

  • Yuille AL, Rangarajan A (2002) The concave-convex procedure (CCCP). In: Advances in neural information processing systems, pp 1033–1040

  • Zhao YP, Sun JG (2010) Robust truncated support vector regression. Expert Syst Appl 37(7):5126–5133

Author information

Corresponding author

Correspondence to Deepak Gupta.

Ethics declarations

Conflict of interest

The authors have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Hazarika, B.B., Gupta, D. & Borah, P. Robust support vector quantile regression with truncated pinball loss (RSVQR). Comp. Appl. Math. 42, 283 (2023). https://doi.org/10.1007/s40314-023-02402-x

