
Multi-scale Support Vector Machine for Regression Estimation

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 3971)

Abstract

Recently, SVMs have been widely applied to regression estimation, but existing algorithms leave the choice of the kernel type and kernel parameters to the user. This is a main cause of degraded regression performance, especially on complicated data, including nonlinear and non-stationary data. By introducing the empirical mode decomposition (EMD) method, with which any complicated data set can be decomposed into a finite and often small number of 'intrinsic mode functions' (IMFs) based on the local characteristic time scale of the data, this paper proposes an important extension to the SVM method: a multi-scale support vector machine based on EMD, in which several kernels of different scales are used simultaneously to approximate the target function at different scales. Experimental results demonstrate the effectiveness of the proposed method.
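The abstract's core idea — decompose the signal into components of different time scales, then fit each component with a kernel whose width matches that scale — can be sketched as follows. This is a hedged illustration, not the authors' algorithm: it substitutes kernel ridge regression for full SVR training, and a simple moving-average split for EMD's sifting procedure (real EMD would extract IMFs from envelopes of local extrema). All function names and parameter values here are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(x, y, gamma):
    # Gaussian (RBF) kernel matrix between 1-D sample vectors x and y.
    d2 = (x[:, None] - y[None, :]) ** 2
    return np.exp(-gamma * d2)

def fit_kernel_ridge(x, y, gamma, lam=1e-3):
    # Kernel ridge regression (stand-in for SVR): alpha = (K + lam*I)^-1 y.
    K = rbf_kernel(x, x, gamma)
    alpha = np.linalg.solve(K + lam * np.eye(len(x)), y)
    return lambda xq: rbf_kernel(xq, x, gamma) @ alpha

# Toy signal with two scales: a slow trend plus a fast oscillation.
x = np.linspace(0.0, 1.0, 200)
y = np.sin(2 * np.pi * x) + 0.3 * np.sin(40 * np.pi * x)

# Stand-in for EMD: split y into a slow moving-average component and a
# fast residual. Real EMD would instead sift out IMFs one by one.
w = 21
pad = np.pad(y, w // 2, mode="edge")
slow = np.convolve(pad, np.ones(w) / w, mode="valid")
fast = y - slow

# One kernel per scale: a narrow kernel for the fast component,
# a wide kernel for the slow residue.
f_fast = fit_kernel_ridge(x, fast, gamma=2000.0)
f_slow = fit_kernel_ridge(x, slow, gamma=20.0)

# The multi-scale estimate is the sum of the per-scale predictions.
y_hat = f_fast(x) + f_slow(x)
rmse = np.sqrt(np.mean((y_hat - y) ** 2))
```

Because `fast + slow` reconstructs `y` exactly, the summed per-scale fits approximate the original signal; the point of the multi-scale construction is that neither single kernel width alone handles both the trend and the oscillation well.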





Copyright information

© 2006 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Yang, Z., Guo, J., Xu, W., Nie, X., Wang, J., Lei, J. (2006). Multi-scale Support Vector Machine for Regression Estimation. In: Wang, J., Yi, Z., Zurada, J.M., Lu, BL., Yin, H. (eds) Advances in Neural Networks - ISNN 2006. ISNN 2006. Lecture Notes in Computer Science, vol 3971. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11759966_151


  • DOI: https://doi.org/10.1007/11759966_151

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-34439-1

  • Online ISBN: 978-3-540-34440-7

  • eBook Packages: Computer Science (R0)
