Abstract
Support vector quantile regression (SVQR) adopts the flexible pinball loss function as the empirical risk for regression problems. Furthermore, \(\varepsilon\)-SVQR attains sparsity by introducing the \(\varepsilon\)-insensitive approach into SVQR. Despite their excellent generalisation performance, the loss functions employed by SVQR and \(\varepsilon\)-SVQR remain sensitive to noise and outliers. This paper proposes a new robust SVQR model, robust support vector quantile regression with truncated pinball loss (RSVQR). RSVQR employs a truncated pinball loss function to reduce the impact of noise. The truncated loss is non-convex, which may lead to locally optimal solutions. To solve the resulting non-convex optimization problem, we apply the concave–convex procedure (CCCP) to the cost function of the proposed method, decomposing the total loss into one convex and one concave part. Several synthetic and real-world datasets are considered in the experimental analysis. Support vector regression (SVR), Huber loss-based SVR (HSVR), asymmetric HSVR (AHSVR), SVQR and \(\varepsilon\)-SVQR are used as baselines for the proposed model. The obtained results demonstrate the applicability of the proposed RSVQR model.
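The losses described above, and the convex/concave split that CCCP exploits, can be sketched numerically. This is a minimal illustration assuming a single truncation level `s` applied to both sides of the pinball loss; the paper's exact parameterisation (e.g. separate truncation levels per side) may differ.

```python
import numpy as np

def pinball(u, tau):
    # Standard pinball (quantile) loss on residual u:
    # tau * u for u >= 0, (tau - 1) * u for u < 0.
    return np.where(u >= 0, tau * u, (tau - 1) * u)

def truncated_pinball(u, tau, s):
    # Truncated pinball loss: the pinball loss capped at s, so a
    # single large residual (noise/outlier) contributes at most s.
    return np.minimum(pinball(u, tau), s)

def excess(u, tau, s):
    # Convex "excess" part: the portion of the pinball loss above s.
    return np.maximum(pinball(u, tau) - s, 0.0)

# CCCP-style decomposition: the non-convex truncated loss equals a
# convex part minus another convex part,
#   truncated_pinball = pinball - excess,
# i.e. convex (pinball) plus concave (-excess). CCCP then iteratively
# linearises the concave part and solves the resulting convex problem.
u = np.linspace(-3, 3, 7)
assert np.allclose(truncated_pinball(u, 0.3, 1.0),
                   pinball(u, 0.3) - excess(u, 0.3, 1.0))
```

The decomposition check at the end is the key identity: because `excess` is convex, subtracting it yields exactly the convex-minus-convex structure that the concave–convex procedure requires.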
Ethics declarations
Conflict of interest
The authors have no conflict of interest.
Cite this article
Hazarika, B.B., Gupta, D. & Borah, P. Robust support vector quantile regression with truncated pinball loss (RSVQR). Comp. Appl. Math. 42, 283 (2023). https://doi.org/10.1007/s40314-023-02402-x
Keywords
- Quantile regression
- Support vector regression
- Loss function
- Truncated pinball loss
- Concave–convex procedure