
A New Support Vector Machine Plus with Pinball Loss

Published in: Journal of Classification

Abstract

The hinge-loss support vector machine (SVM) is sensitive to outliers. This paper proposes a new support vector machine plus with a pinball loss function (PSVM+). The new model is less sensitive to noise, especially feature noise around the decision boundary, and it is more stable under re-sampling than the hinge-loss support vector machine plus (SVM+). Like the SVM+, the PSVM+ embeds additional information into the corresponding optimization problem, which helps to further improve learning performance. Meanwhile, the computational complexity of the PSVM+ is similar to that of the SVM+.
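The contrast between the two losses can be illustrated with a short sketch. The standard hinge loss is max(0, u) with margin variable u = 1 − y·f(x), so it ignores correctly classified points; the pinball loss instead keeps a small negative-side slope τ, which is what yields the reduced sensitivity to feature noise near the decision boundary. The function names and the choice τ = 0.5 below are illustrative, not from the paper:

```python
import numpy as np

def hinge_loss(margin):
    """Hinge loss max(0, u), where u = 1 - y * f(x)."""
    u = np.asarray(margin, dtype=float)
    return np.maximum(0.0, u)

def pinball_loss(margin, tau=0.5):
    """Pinball loss for classification:
    L_tau(u) = u if u >= 0, else -tau * u, where u = 1 - y * f(x).
    For tau = 0 this reduces to the hinge loss."""
    u = np.asarray(margin, dtype=float)
    return np.where(u >= 0, u, -tau * u)
```

For a point classified well inside its margin (u < 0), the hinge loss is exactly zero, while the pinball loss still charges −τ·u; penalizing such points slightly is what pushes the decision boundary away from noisy features instead of hugging a few boundary samples.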



Author information

Corresponding author

Correspondence to Yingyuan Xiao.

Additional information

This work is supported by the Natural Science Foundation of China (No. 61170174) and the Tianjin Training Plan of University Innovation Team (No. TD 12-5016).


Cite this article

Zhu, W., Song, Y. & Xiao, Y. A New Support Vector Machine Plus with Pinball Loss. J Classif 35, 52–70 (2018). https://doi.org/10.1007/s00357-018-9249-y
