
Interval Regression Analysis with Soft-Margin Reduced Support Vector Machine

Conference paper · Next-Generation Applied Intelligence (IEA/AIE 2009)

Abstract

The support vector machine (SVM) has been shown to be an efficient approach for a variety of classification problems, and it has also been widely used in pattern recognition, regression, and distribution estimation for crisp data. However, three main problems arise when using the SVM model: (1) Large scale: for large-scale data sets, a solution of the SVM with nonlinear kernels may be difficult to find; (2) Imbalance: the number of samples from one class is much larger than the number from the other classes, which causes the separation margin to shift; (3) Noise and interaction: the data distribution becomes hard to describe, and the separation margin between classes becomes a “gray” zone. Under these circumstances, an efficient method is needed. Recently, the reduced support vector machine (RSVM) was proposed as an alternative to the standard SVM and has been shown to be more efficient than the traditional SVM for processing large-scale data. In this paper we apply the principle of RSVM to interval regression analysis. In addition, a soft-margin method is proposed to correct the shift of the separation margin and to remain effective in the gray zone.
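To make the reduced-kernel idea behind RSVM concrete, the sketch below illustrates the two ingredients the abstract mentions: a rectangular kernel matrix built from a small random subset of the training points (the "reduced set"), and lower/upper bound functions that bracket the observed responses. This is a minimal sketch, not the authors' soft-margin quadratic-programming formulation; the function names (rbf_kernel, reduced_kernel_interval_fit) and the residual-quantile interval width are illustrative assumptions introduced here.

# Minimal sketch of the reduced-kernel (RSVM) idea applied to interval regression.
# Assumptions (not from the paper): the interval half-width is taken from a
# residual quantile of a reduced-kernel ridge fit, as a stand-in for the
# soft-margin optimization described in the paper.

import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    # K[i, j] = exp(-gamma * ||X[i] - Z[j]||^2)
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-gamma * d2)

def reduced_kernel_interval_fit(X, y, m=20, gamma=1.0, lam=1e-2, coverage=0.9, seed=0):
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=min(m, len(X)), replace=False)
    Xr = X[idx]                          # reduced set: the "R" in RSVM
    K = rbf_kernel(X, Xr, gamma)         # n x m reduced kernel matrix (not n x n)
    # Regularized least squares on the reduced basis gives a center estimate.
    alpha = np.linalg.solve(K.T @ K + lam * np.eye(K.shape[1]), K.T @ y)
    resid = y - K @ alpha
    # Symmetric residual quantile as a crude interval half-width.
    h = np.quantile(np.abs(resid), coverage)
    def predict_interval(Xnew):
        c = rbf_kernel(Xnew, Xr, gamma) @ alpha
        return c - h, c + h              # lower and upper bound functions
    return predict_interval

# Usage: fit noisy 1-D data and check how many points the interval band covers.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.uniform(-3, 3, size=(200, 1))
    y = np.sin(X[:, 0]) + 0.2 * rng.standard_normal(200)
    predict = reduced_kernel_interval_fit(X, y, m=25, gamma=0.5)
    lower, upper = predict(X)
    print("empirical coverage:", np.mean((y >= lower) & (y <= upper)))

The key computational point is that the kernel matrix has only as many columns as the reduced set (m), so the fit scales with n x m rather than n x n, which is what makes RSVM attractive for large-scale data.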






Copyright information

© 2009 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Huang, C.H., Kao, H.Y. (2009). Interval Regression Analysis with Soft-Margin Reduced Support Vector Machine. In: Chien, B.C., Hong, T.P., Chen, S.M., Ali, M. (eds.) Next-Generation Applied Intelligence. IEA/AIE 2009. Lecture Notes in Computer Science, vol. 5579. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-02568-6_84

  • DOI: https://doi.org/10.1007/978-3-642-02568-6_84

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-02567-9

  • Online ISBN: 978-3-642-02568-6

  • eBook Packages: Computer Science (R0)
